Patent Abstract:
Teleoperation of autonomous cars to negotiate problematic situations. Methods and systems are described for providing remote support and negotiating problematic situations of unattended vehicle operation based on signal states and vehicle information. The disclosed technology receives status data for vehicles at an apparatus such as a vehicle remote support apparatus. The status data indicates a respective current status for each of the vehicles. Each vehicle is assigned to a respective vehicle remote support queue based on its status data. An indication that one of the vehicles is requesting remote support is received by the vehicle remote support apparatus. In response to a determination that a change in the status data indicates that the unattended operation of one of the vehicles is outside defined parameter values, remote support is provided to that vehicle via a communication link by transmitting instruction data that modifies its unattended operation.
Publication Number: BR112019010723A2
Application Number: R112019010723
Filing Date: 2017-11-30
Publication Date: 2019-10-01
Inventors: Schafer Eric; Utz Hans; Pedersen Liam; Sierhuis Maarten; Bualat Maria; Allan Mark; Della Penna Mauro; Fong Terrence
Applicants: Nissan North America Inc; NASA
IPC Main Class:
Patent Description:

TELEOPERATION OF AUTONOMOUS CARS TO NEGOTIATE PROBLEMATIC SITUATIONS.
TECHNICAL FIELD [001] This application relates to vehicle interfaces for autonomous vehicle monitoring, including methods, devices, systems and non-transitory computer-readable media for the remote monitoring and teleoperation of autonomous vehicles.
BACKGROUND [002] The increasing use of autonomous vehicles creates the potential for more efficient movement of passengers and cargo through a transportation network. In addition, the use of autonomous vehicles can result in improved vehicle safety and more effective communication between vehicles. However, even in situations where autonomous vehicles are individually effective, the interaction between groups of autonomous vehicles can result in unexpected interruptions when autonomous vehicle programming parameters are exceeded. When the number of autonomous vehicles exceeds a certain threshold, it becomes inefficient to use human supervision to monitor autonomous vehicles on a one-to-one basis. As such, existing forms of autonomous vehicle monitoring do not organize the monitoring of autonomous vehicle status in an efficient manner.
SUMMARY [003] Disclosed herein are aspects, features, elements and implementations for remote support of autonomous vehicle operation.
[004] One aspect of the disclosed implementations includes a remote support system for autonomous vehicle operation, the system comprising: a communication system that receives state data from vehicles and transmits instruction data to the vehicles, the vehicles including a first vehicle; first-level control stations that display the vehicle state data, the first-level control stations including a first control station that, in response to a change in the state data indicating that the autonomous operation of the first vehicle is operating outside defined parameter values, transmits instruction data to the first vehicle through the communication system; and a second-level control station, associated with the first-level control stations, that assigns responsibility for the vehicles among the first-level control stations.
[005] One aspect of the disclosed implementations includes a method for providing remote support for autonomous vehicle operation, the method comprising: receiving, by a vehicle remote support device, state data for vehicles, where the state data includes a current state of the vehicles; assigning, by the vehicle remote support device, the vehicles to remote vehicle support queues based on the state data; receiving, by the vehicle remote support device, an indication that a vehicle of the vehicles is requesting remote support; and, in response to a determination that a change in the state data indicates that the autonomous operation of the vehicle is operating outside defined parameter values, transmitting, by the vehicle remote support device, instruction data to the vehicle, where the instruction data modifies the autonomous operation of the vehicle.
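For illustration only, and not as part of the disclosed subject matter, the following Python sketch shows one possible way the receive/assign/respond steps of this method could be organized. The class and field names (VehicleStatus, SupportQueue, max_minutes_behind) and the choice of a lateness threshold as the "defined parameter value" are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class VehicleStatus:
    # Hypothetical state-data record; the field names are illustrative assumptions.
    vehicle_id: str
    minutes_behind_schedule: float
    requesting_support: bool = False


@dataclass
class SupportQueue:
    name: str
    vehicle_ids: List[str] = field(default_factory=list)


class VehicleRemoteSupportDevice:
    """Sketch of the steps in paragraph [005]: receive state data, assign each
    vehicle to a remote support queue, and transmit instruction data when a
    vehicle operates outside defined parameter values."""

    def __init__(self, max_minutes_behind: float = 5.0):
        # "Defined parameter value" chosen arbitrarily for the sketch.
        self.max_minutes_behind = max_minutes_behind
        self.queues: Dict[str, SupportQueue] = {
            "normal": SupportQueue("normal"),
            "urgent": SupportQueue("urgent"),
        }

    def receive_status(self, status: VehicleStatus) -> None:
        # Assign the vehicle to a remote support queue based on its state data.
        target = "urgent" if (status.requesting_support or
                              status.minutes_behind_schedule > self.max_minutes_behind) else "normal"
        for queue in self.queues.values():
            if status.vehicle_id in queue.vehicle_ids:
                queue.vehicle_ids.remove(status.vehicle_id)
        self.queues[target].vehicle_ids.append(status.vehicle_id)

        # A change indicating out-of-parameter operation triggers instruction data.
        if status.minutes_behind_schedule > self.max_minutes_behind:
            self.transmit_instruction(status.vehicle_id, "modify_route")

    def transmit_instruction(self, vehicle_id: str, instruction: str) -> None:
        # Stand-in for transmission over the communication link to the vehicle.
        print(f"instruction data for {vehicle_id}: {instruction}")


device = VehicleRemoteSupportDevice()
device.receive_status(VehicleStatus("vehicle-1", minutes_behind_schedule=7.5))
```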
[006] One aspect of the disclosed implementations includes a method for remote support, the method comprising: classifying, by a remote vehicle support device, vehicles based on vehicle state data; generating, by the remote vehicle support device, a display including a plurality of indicators arranged according to the classification, each indicator of the plurality of indicators representing the state data of a respective one of the vehicles, wherein each vehicle is assigned to a respective remote vehicle support queue based on the classification, each remote vehicle support queue being configured to provide remote support using the respective assigned vehicle indicators; and, in response to a determination that a change in the state data indicates that the autonomous operation of a first vehicle of the vehicles is operating outside defined parameter values, interacting with a first indicator of the plurality of indicators representing the first vehicle to transmit instruction data to the first vehicle.
[007] These and other aspects of the present disclosure are disclosed in the following detailed description of the embodiments, the appended claims and the accompanying figures.
BRIEF DESCRIPTION OF THE DRAWINGS [008] The disclosed technology is best understood from the following detailed description when read in conjunction with the accompanying drawings. It is emphasized that, in accordance with common practice, the various features of the drawings are not to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity.
[009] Figure 1 is a diagram of an example of a portion of a vehicle in which the aspects, features and elements disclosed herein can be implemented.
[010] Figure 2 is a diagram of an example of a portion of a vehicle communication and transport system in which the aspects, features and elements disclosed herein can be implemented.
[011] Figure 3 is a block diagram illustrating a remote vehicle assistance center in accordance with the present disclosure. Figure 4 is a diagram illustrating an example of a vehicle indicator for use in an interface in accordance with the present disclosure.
[012] Figure 5 is a diagram illustrating an example of a fleet manager interface according to the present disclosure.
[013] Figure 6 is a diagram illustrating an example of a vehicle indicator for use in an interface in accordance with the present disclosure.
[014] Figure 7 is a diagram illustrating an example of a vehicle manager interface according to the present disclosure.
[015] Figure 8 is a diagram illustrating an example of a vehicle manager interface according to the present disclosure.
[016] Figure 9 is a diagram illustrating an example of a vehicle manager interface according to the present disclosure.
[017] Figure 10 is a diagram illustrating an example of a vehicle manager interface according to the present disclosure.
[018] Figure 11 is a flow chart of a technique for remote support according to the present disclosure.
[019] Figure 12 is a flow chart of a technique for remote support of autonomous vehicles according to the present disclosure.
[020] Figure 13 is a flow chart of a technique for remote support of autonomous vehicles according to the present disclosure.
[021] Figure 14 is a flow chart of a technique for remote support of autonomous vehicles according to the present disclosure.
[022] Figure 15 is a flow chart of a method for providing remote support for autonomous operation of a plurality of vehicles in accordance with the present disclosure.
DETAILED DESCRIPTION [023] The monitoring and operation of a large number of autonomous vehicles can present challenges that are different from the challenges of monitoring and operating a smaller number of vehicles, or vehicles that are not operated autonomously. Regarding the number of vehicles, a small number of vehicles can be controlled by a correspondingly small number of individuals. When the number of vehicles is small, it is often less likely that vehicles will interact adversely with one another (for example, forming traffic congestion). In addition, managing a small number of vehicles as a group will tend to result in smaller gains in the efficiency of vehicle traffic flow when compared to managing significantly larger vehicle groups. With regard to the autonomous aspect of the vehicles being monitored, the monitoring and operation of non-autonomous vehicles generally does not involve direct control (for example, teleoperation) of one or more vehicles by a human operator. As such, fewer control options are presented to the operator of non-autonomous vehicles.
[024] The present disclosure and the disclosed technology provide a more effective interface for monitoring and operating vehicles, including autonomous vehicles, by leveraging a multilayered approach that includes customized interfaces both for fleet managers, who are charged with supervising an entire fleet of autonomous vehicles monitored by numerous vehicle managers, and for the vehicle managers themselves, to whom autonomous vehicles are allocated or assigned for monitoring by a fleet manager based on analytical data. In addition, the disclosed technology provides a way to group autonomous vehicles based on various criteria, such as their level of urgency or shared characteristics. In this way, the management of vehicle monitoring and operation is more efficiently distributed, which allows for a more efficient transport network.
[025] As used herein, the terminology "driver" or "operator" may be used interchangeably. As used herein, the terminology "brake" or "slow down" may be used interchangeably. As used herein, the terminology "computer" or "computing device" includes any unit, or combination of units, capable of carrying out any method, or any portion or portions thereof, disclosed herein.
[026] As used herein, the terminology "processor" indicates one or more processors, such as one or more special-purpose processors, one or more digital signal processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more application processors, one or more application-specific integrated circuits, one or more application-specific standard products, one or more field programmable gate arrays, any other type or combination of integrated circuits, one or more state machines or any combination thereof.
[027] As used herein, the terminology "memory" indicates any computer-usable or computer-readable medium or device that can contain, store, communicate or carry in tangible form any signal or information that can be used by or in connection with any processor. For example, a memory can be one or more read-only memories (ROM), one or more random access memories (RAM), one or more registers, one or more low power double data rate (LPDDR) memories, one or more cache memories, one or more semiconductor memory devices, one or more magnetic media, one or more optical media, one or more magneto-optical media or any combination thereof.
[028] As used herein, the terminology "instructions" may include directions or expressions for carrying out any method, or any portion or portions thereof, disclosed herein, and may be realized in hardware, software or any combination thereof. For example, instructions can be implemented as information, such as a computer program, stored in memory that can be executed by a processor to perform any of the respective methods, algorithms, aspects or combinations thereof, as described herein. In some implementations, instructions, or a portion thereof, can be implemented as a special-purpose processor, or circuit, which may include specialized hardware to execute any of the methods, algorithms, aspects or combinations thereof, as described herein. In some implementations, portions of the instructions can be distributed across multiple processors on a single device, or across multiple devices, which can communicate directly or over a network, such as a local area network, a wide area network, the Internet, or a combination thereof.
[029] As used herein, the terminology "example", "embodiment", "implementation", "aspect", "feature" or "element" indicates serving as an example, instance or illustration. Unless expressly indicated otherwise, any example, embodiment, implementation, aspect, feature or element is independent of each other example, embodiment, implementation, aspect, feature or element and can be used in combination with any other example, embodiment, implementation, aspect, feature or element.
[030] As used herein, the terminology "determine" and "identify", or any variations thereof, includes selecting, ascertaining, computing, looking up, receiving, determining, establishing, obtaining or otherwise identifying or determining in any manner whatsoever using one or more of the devices shown and described herein.
[031] As used herein, the terminology "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise, or evident from the context, "X includes A or B" is intended to indicate any of the natural inclusive permutations. If X includes A; X includes B; or X includes both A and B, then "X includes A or B" is satisfied under any of the foregoing instances. In addition, the articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more", unless specified otherwise or clear from the context to be directed to a singular form.
[032] Furthermore, for simplicity of explanation, although the figures and descriptions herein may include sequences or series of steps or stages, elements of the methods disclosed herein can occur in various orders or concurrently. In addition, elements of the methods disclosed herein can occur with other elements not explicitly presented and described herein. Furthermore, not all elements of the methods described herein may be required to implement a method in accordance with this disclosure. Although aspects, features and elements are described herein in particular combinations, each aspect, feature or element can be used independently or in various combinations with or without other aspects, features and elements.
[033] Implementations of this disclosure provide specific technological improvements to computer networks and autonomous vehicle management, for example, those related to the extension of computer network components to remotely monitor and teleoperate autonomous vehicles. The development of new ways to monitor autonomous vehicle network resources to, for example, identify damage to the system or to a vehicle, indicate where management or attention is needed, and communicate instructions or information between monitoring devices and vehicles is fundamentally rooted in autonomous-vehicle-related computer networks.
[034] The implementations of this disclosure provide at least a system and a method for the remote support of the autonomous operation of a plurality of vehicles. The system includes a communication system that receives state data from the plurality of vehicles and transmits instruction data to the plurality of vehicles based on the received state data. Each of the plurality of vehicles provides state data. The system includes a plurality of first-level control stations (for example, vehicle managers), each of which displays the state data of the plurality of vehicles. In response to a determination that a change in the state data of at least one of the plurality of vehicles indicates that the autonomous operation of the at least one of the plurality of vehicles is operating outside defined parameter values (for example, a delivery vehicle is delayed relative to an estimated arrival time), the first-level control station assigned to monitor the at least one of the plurality of vehicles transmits specific instruction data to the at least one of the plurality of vehicles via the communication system. The system also includes a second-level control station (for example, a fleet manager) associated with the first-level control stations that allocates responsibility for vehicles among the first-level control stations. Assigning responsibility allows the system to balance the workload among the first-level control stations or otherwise optimize their operating and monitoring capabilities.
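As a purely illustrative sketch of the workload-balancing role of the second-level control station, the following Python function redistributes vehicle responsibility among hypothetical first-level control stations until their loads are roughly even; the greedy strategy and the station names are assumptions, not features required by the disclosure.

```python
from typing import Dict, List


def balance_responsibility(assignments: Dict[str, List[str]]) -> Dict[str, List[str]]:
    """Move vehicles from the most heavily loaded first-level control station to
    the most lightly loaded one until the loads differ by at most one vehicle."""
    stations = list(assignments)
    while True:
        busiest = max(stations, key=lambda s: len(assignments[s]))
        idlest = min(stations, key=lambda s: len(assignments[s]))
        if len(assignments[busiest]) - len(assignments[idlest]) <= 1:
            return assignments
        # Transfer responsibility for one vehicle at a time.
        assignments[idlest].append(assignments[busiest].pop())


print(balance_responsibility({"station_a": ["v1", "v2", "v3", "v4"], "station_b": []}))
# {'station_a': ['v1', 'v2'], 'station_b': ['v4', 'v3']}
```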
[035] US patent application 15/463,242, filed on March 20, 2017, entitled "OBJECT MANAGEMENT DISPLAY", is incorporated herein by reference in its entirety.
[036] To describe some implementations in greater detail, reference is made to the following figures.
[037] Figure 1 is a diagram of an example of a vehicle 1000 in which the aspects, features and elements disclosed herein can be implemented. The vehicle 1000 includes a chassis 1100, a power train 1200, a controller 1300, wheels 1400/1410/1420/1430, or any other element or combination of elements of a vehicle. Although the vehicle 1000 is shown as including four wheels 1400/1410/1420/1430 for simplicity, any other propulsion device or devices, such as a propeller or tread, can be used. In figure 1, the lines connecting elements, such as the power train 1200, the controller 1300 and the wheels 1400/1410/1420/1430, indicate that information, such as data or control signals, energy, such as electrical energy or torque, or both information and energy can be communicated between the respective elements. For example, the controller 1300 can receive power from the power train 1200 and communicate with the power train 1200, the wheels 1400/1410/1420/1430, or both, to control the vehicle 1000, which can include accelerating, decelerating, steering or otherwise controlling the vehicle 1000.
[038] The power train 1200 includes a power source 1210, a transmission 1220, a steering unit 1230, a vehicle actuator 1240 or any other element or combination of elements of a power train, such as a suspension, a drive shaft, shafts or an exhaust system. Although shown separately, the 1400/1410/1420/1430 wheels can be included in the 1200 power train.
[039] The power source 1210 can be any device or combination of devices operative to provide energy, such as electrical energy, thermal energy or kinetic energy. For example, the power source 1210 includes an engine, such as an internal combustion engine, an electric motor or a combination of an internal combustion engine and an electric motor, and is operative to provide kinetic energy as a driving force to one or more of the wheels 1400/1410/1420/1430. In some embodiments, the power source 1210 includes a potential energy unit, such as one or more dry cell batteries, such as nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel-metal hydride (NiMH) or lithium-ion (Li-ion) batteries; solar cells; fuel cells or any other device capable of providing energy.
[040] The transmission 1220 receives energy, such as kinetic energy, from the power source 1210 and transmits the energy to the wheels 1400/1410/1420/1430 to produce a driving force. The transmission 1220 can be controlled by the controller 1300, the vehicle actuator 1240 or both. The steering unit 1230 can be controlled by the controller 1300, the vehicle actuator 1240 or both, and controls the wheels 1400/1410/1420/1430 to steer the vehicle. The vehicle actuator 1240 can receive signals from the controller 1300 and can actuate or control the power source 1210, the transmission 1220, the steering unit 1230 or any combination thereof to operate the vehicle 1000.
[041] In some embodiments, the 1300 controller includes a 1310 location unit, a 1320 electronic communication unit, a 1330 processor, a 1340 memory, a 1350 user interface, a 1360 sensor, a 1370 electronic communication interface or any combination thereof. Although shown as a single unit, any one or more elements of the 1300 controller can be integrated into any number of separate physical units. For example, the 1350 user interface and the 1330 processor can be integrated into a first physical unit and the 1340 memory can be integrated into a second physical unit. Although not shown in figure 1, the 1300 controller can include a power source, such as a battery. Although shown as separate elements, the location unit 1310, the electronic communication unit 1320, the processor 1330, the memory 1340, the user interface 1350, the sensor 1360, the electronic communication interface 1370 or any combination of these can be integrated on one or more electronic units, circuits or chips.
[042] In some embodiments, the processor 1330 includes any device or combination of devices capable of manipulating or processing a signal or other information now existing or hereafter developed, including optical processors, quantum processors, molecular processors or a combination thereof. For example, the processor 1330 may include one or more special-purpose processors, one or more digital signal processors, one or more microprocessors, one or more controllers, one or more microcontrollers, one or more integrated circuits, one or more application-specific integrated circuits, one or more field programmable gate arrays, one or more programmable logic arrays, one or more programmable logic controllers, one or more state machines or any combination thereof. The processor 1330 can be operatively coupled with the location unit 1310, the memory 1340, the electronic communication interface 1370, the electronic communication unit 1320, the user interface 1350, the sensor 1360, the power train 1200 or any combination thereof. For example, the processor can be operatively coupled with the memory 1340 via a communication bus 1380.
[043] In some embodiments, the 1330 processor can be configured to execute instructions including instructions for remote operation that can be used to operate vehicle 1000 from a remote location, including the operations center. Instructions for remote operation can be stored on vehicle 1000 or received from an external source, such as a traffic management center, or server computing devices, which can include cloud-based server computing devices.
[044] The memory 1340 may include any tangible, non-transitory, computer-usable or computer-readable medium capable of, for example, containing, storing, communicating or carrying machine-readable instructions, or any information associated therewith, for use by or in connection with the processor 1330. The memory 1340 is, for example, one or more solid state drives, one or more memory cards, one or more removable media, one or more read-only memories, one or more random access memories, one or more disks, including a hard disk, a floppy disk, an optical disk, a magnetic or optical card, or any type of non-transitory medium suitable for storing electronic information, or any combination thereof.
[045] The electronic communication interface 1370 can be a wireless antenna, as shown, a wired communication port, an optical communication port or any other wired or wireless unit capable of interfacing with a wired or wireless electronic communication medium 1500.
[046] The electronic communication unit 1320 can be configured to transmit or receive signals through the wired or wireless electronic communication medium 1500, such as through the electronic communication interface 1370. Although not explicitly shown in figure 1, the electronic communication unit 1320 is configured to transmit, receive, or both, through any wired or wireless communication medium, such as radio frequency (RF), ultraviolet (UV), visible light, optical fiber, cable line or a combination thereof. Although figure 1 shows a single electronic communication unit 1320 and a single electronic communication interface 1370, any number of communication units and any number of communication interfaces can be used. In some embodiments, the electronic communication unit 1320 may include a dedicated short-range communications (DSRC) unit, a wireless safety unit (WSU), IEEE 802.11p (WiFi-P) or a combination thereof.
[047] The location unit 1310 can determine geolocation information, including but not limited to longitude, latitude, elevation, direction of travel or speed, of the vehicle 1000. For example, the location unit includes a global positioning system (GPS) unit, such as a Wide Area Augmentation System (WAAS) enabled National Marine Electronics Association (NMEA) unit, a radio triangulation unit or a combination thereof. The location unit 1310 can be used to obtain information that represents, for example, a current heading of the vehicle 1000, a current position of the vehicle 1000 in two or three dimensions, a current angular orientation of the vehicle 1000, or a combination thereof.
[048] The 1350 user interface can include any unit capable of being used as an interface by a person, including any of a virtual keyboard, a physical keyboard, a touch sensitive keyboard, a monitor, a touch screen, a speaker, a microphone, a video camera, a sensor and a printer. The 1350 user interface can be functionally coupled with the 1330 processor, as shown, or with any other element of the 1300 controller. Although shown as a single unit, the 1350 user interface can include one or more physical units. For example, the 1350 user interface includes an audio interface for carrying out audio communication with a person and a touch screen for carrying out visual and touch communication with the person.
[049] The sensor 1360 can include one or more sensors, such as a sensor array, which can be operable to provide information that can be used to control the vehicle. The sensor 1360 can provide information about the current operational characteristics of the vehicle or its surroundings. The sensor 1360 includes, for example, a speed sensor, acceleration sensors, a steering angle sensor, traction-related sensors, braking-related sensors or any sensor, or combination of sensors, that is operable to report information about some aspect of the current dynamic situation of the vehicle 1000.
[050] In some embodiments, the sensor 1360 may include sensors that are operable to obtain information about the physical environment surrounding the vehicle 1000. For example, one or more sensors detect the road geometry and obstacles, such as fixed obstacles, vehicles, cyclists and pedestrians. In some embodiments, the sensor 1360 may be or include one or more video cameras, laser detection systems, infrared detection systems, acoustic detection systems or any other suitable type of on-vehicle environmental sensing device, or combination of devices, now known or later developed. In some embodiments, the sensor 1360 and the location unit 1310 are combined.
[051] Although not shown separately, the vehicle 1000 may include a trajectory controller. For example, the controller 1300 can include a trajectory controller. The trajectory controller can be operable to obtain information describing a current state of the vehicle 1000 and a planned route for the vehicle 1000 and, based on this information, determine and optimize a trajectory for the vehicle 1000. In some embodiments, the trajectory controller outputs signals operable to control the vehicle 1000 such that the vehicle 1000 follows the trajectory that is determined by the trajectory controller. For example, the output of the trajectory controller can be an optimized trajectory that can be supplied to the power train 1200, the wheels 1400/1410/1420/1430 or both. In some embodiments, the optimized trajectory can be control inputs, such as a set of steering angles, with each steering angle corresponding to a point in time or a position. In some embodiments, the optimized trajectory can be one or more paths, lines, curves or a combination thereof.
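For illustration, a trajectory expressed as a set of steering angles, each corresponding to a point in time, could be represented and sampled as in the following Python sketch; the SteeringSetpoint type and the use of linear interpolation are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class SteeringSetpoint:
    time_s: float      # point in time along the planned route
    angle_deg: float   # steering angle to apply at that time


def steering_angle_at(trajectory: List[SteeringSetpoint], t: float) -> float:
    # Linearly interpolate the commanded steering angle at time t.
    if t <= trajectory[0].time_s:
        return trajectory[0].angle_deg
    for a, b in zip(trajectory, trajectory[1:]):
        if a.time_s <= t <= b.time_s:
            frac = (t - a.time_s) / (b.time_s - a.time_s)
            return a.angle_deg + frac * (b.angle_deg - a.angle_deg)
    return trajectory[-1].angle_deg


plan = [SteeringSetpoint(0.0, 0.0), SteeringSetpoint(2.0, 10.0), SteeringSetpoint(4.0, 0.0)]
print(steering_angle_at(plan, 3.0))  # 5.0
```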
[052] One or more of the wheels 1400/1410/1420/1430 can be a steered wheel, which is pivoted to a steering angle under the control of the steering unit 1230; a propelled wheel, which receives torque to propel the vehicle 1000 under the control of the transmission 1220; or a steered and propelled wheel that steers and propels the vehicle 1000.
[053] A vehicle can include units or elements not shown in figure 1, such as an enclosure, a Bluetooth® module, a frequency modulated (FM) radio unit, a near-field communication (NFC) module, a liquid crystal display (LCD) display unit, an organic light-emitting diode (OLED) display unit, a speaker, or any combination thereof.
[054] Figure 2 is a diagram of an example of a portion of a vehicle transport and communication system 2000 in which the aspects, features and elements disclosed herein can be implemented. The vehicle transport and communication system 2000 includes a vehicle 2100, such as the vehicle 1000 shown in figure 1, and one or more external objects, such as an external object 2110, which can include any form of transportation, such as the vehicle 1000 shown in figure 1, a pedestrian or a cyclist, as well as any form of structure, such as a building. The vehicle 2100 can travel through one or more portions of a transport network 2200 and can communicate with the external object 2110 through one or more electronic communication networks 2300. Although not explicitly shown in figure 2, a vehicle can traverse an area that is not expressly or completely included in a transport network, such as an off-road area. In some embodiments, the transport network 2200 may include one or more vehicle detection sensors 2202, such as an inductive loop sensor, which can be used to detect the movement of vehicles on the transport network 2200.
[055] The electronic communication network 2300 can be a multiple access system that provides communication, such as voice communication, data communication, video communication, message communication or a combination thereof, between vehicle 2100, the external object 2110 and an operations center 2400. For example, vehicle 2100 or external object 2110 can receive information, such as information representing the transport network 2200, from operations center 2400 via electronic communication network 2300.
[056] The operations center 2400 includes a controller device 2410 that includes some or all of the features of the controller 1300 shown in figure 1. The controller device 2410 can monitor and coordinate the movement of vehicles, including autonomous vehicles. The controller device 2410 can monitor the state or condition of vehicles, such as the vehicle 2100, and of external objects, such as the external object 2110. The controller device 2410 can receive vehicle data and infrastructure data, including any of the following: vehicle speed, vehicle location, vehicle operational state, vehicle destination, vehicle route, vehicle sensor data, external object speed, external object location, external object operational state, external object destination, external object route and external object sensor data.
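A minimal sketch, assuming hypothetical field names, types and units, of how such vehicle data might be grouped into a single record on the receiving side:

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class VehicleReport:
    # Illustrative container for the vehicle data listed above; the field
    # names, types and units are assumptions, not part of the disclosure.
    vehicle_id: str
    speed_kph: float
    location: Tuple[float, float]            # (latitude, longitude)
    operational_state: str                   # e.g. "nominal" or "fault"
    destination: Optional[Tuple[float, float]] = None
    route_id: Optional[str] = None


report = VehicleReport("vehicle-2100", 32.0, (35.68, 139.77), "nominal")
print(report.operational_state)
```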
[057] In addition, the controller device 2410 can establish remote control over one or more vehicles, such as the vehicle 2100, or external objects, such as the external object 2110. In this way, the controller device 2410 can teleoperate the vehicles or external objects from a remote location. The controller device 2410 can exchange (send or receive) state data with vehicles, external objects or computing devices, such as the vehicle 2100, the external object 2110 or a server computing device 2500, via a wireless communication link, such as the wireless communication link 2380, or a wired communication link, such as the wired communication link 2390.
[058] The server computing device 2500 may include one or more server computing devices that can exchange (send or receive) state signal data with one or more vehicles or computing devices, including the vehicle 2100, the external object 2110 or the operations center 2400, through the electronic communication network 2300.
[059] In some embodiments, the vehicle 2100 or the external object 2110 communicates via the wired communication link 2390, a wireless communication link 2310/2320/2370, or a combination of any number or types of wired or wireless communication links. For example, as shown, the vehicle 2100 or the external object 2110 communicates over a terrestrial wireless communication link 2310, via a non-terrestrial wireless communication link 2320, or through a combination thereof. In some implementations, a terrestrial wireless communication link 2310 includes an Ethernet connection, a serial connection, a Bluetooth connection, an infrared (IR) connection, an ultraviolet (UV) connection or any connection capable of providing electronic communication.
[060] A vehicle, such as the vehicle 2100, or an external object, such as the external object 2110, can communicate with another vehicle, another external object or the operations center 2400. For example, a host or subject vehicle 2100 can receive one or more automated inter-vehicle messages, such as a basic safety message (BSM), from the operations center 2400 via a direct communication link 2370 or via the electronic communication network 2300. The operations center 2400 can transmit the message to host vehicles within a defined transmission range, such as three hundred meters, or to a defined geographical area. In some embodiments, the vehicle 2100 receives a message through a third party, such as a signal repeater (not shown) or another remote vehicle (not shown). In some embodiments, the vehicle 2100 or the external object 2110 periodically transmits one or more automated inter-vehicle messages based on a defined interval, such as one hundred milliseconds.
[061] Automated inter-vehicle messages may include vehicle identification information; geospatial state information, such as longitude, latitude or elevation information; geospatial location accuracy information; kinematic state information, such as vehicle acceleration information, yaw rate information, speed information, vehicle position information, braking system state data, throttle information, steering wheel angle information or vehicle routing information; or vehicle operating state information, such as vehicle size information, headlight state information, turn signal information, windshield wiper state data, transmission information or any other information, or combination of information, relevant to the state of the transmitting vehicle. For example, the transmission state information indicates whether the transmission of the transmitting vehicle is in a neutral state, a parked state, a forward state or a reverse state.
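For illustration only, the following Python sketch models a small, hypothetical subset of such a message and its periodic transmission on a defined interval; the field names are assumptions, and the real basic safety message format (standardized in SAE J2735) is not reproduced here.

```python
import time
from dataclasses import dataclass, asdict


@dataclass
class BasicSafetyMessage:
    # Illustrative subset of fields only; not the standardized BSM layout.
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    brake_active: bool
    transmission_state: str  # "neutral", "parked", "forward" or "reverse"


def broadcast(message: BasicSafetyMessage, interval_s: float = 0.1, count: int = 3) -> None:
    # Transmit the message on a defined interval (one hundred milliseconds here).
    for _ in range(count):
        print(asdict(message))  # stand-in for the radio or network transmit call
        time.sleep(interval_s)


broadcast(BasicSafetyMessage("vehicle-2100", 35.68, 139.77, 12.5, 90.0, False, "forward"))
```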
[062] In some embodiments, the vehicle 2100 communicates with the electronic communication network 2300 through an access point 2330. The access point 2330, which may include a computing device, can be configured to communicate with the vehicle 2100, with the electronic communication network 2300, with the operations center 2400 or with a combination thereof via wired or wireless communication links 2310/2340. For example, an access point 2330 is a base station, a base transceiver station (BTS), a Node B, an enhanced Node B (eNode B), a Home Node B (HNode-B), a wireless router, a wired router, a connection port, a relay, a switch, or any similar wired or wireless device. Although shown as a single unit, an access point can include any number of interconnected elements.
[063] The vehicle 2100 can communicate with the electronic communication network 2300 through a satellite 2350 or another non-terrestrial communication device. The satellite 2350, which may include a computing device, can be configured to communicate with the vehicle 2100, the electronic communication network 2300, the operations center 2400 or a combination thereof via one or more communication links 2320/2360. Although shown as a single unit, a satellite can include any number of interconnected elements.
[064] The electronic communication network 2300 can be any type of network configured to provide voice, data or any other type of electronic communication. For example, the electronic communication network 2300 includes a local area network (LAN), a wide area network (WAN), a virtual private network (VPN), a mobile or cellular telephone network, the Internet, or any other electronic communication system. The electronic communication network 2300 can use a communication protocol, such as the transmission control protocol (TCP), the user datagram protocol (UDP), the internet protocol (IP), the real-time transport protocol (RTP), the hypertext transport protocol (HTTP) or a combination thereof. Although shown as a single unit, an electronic communication network can include any number of interconnected elements.
[065] In some embodiments, the vehicle 2100 communicates with the operations center 2400 through the electronic communication network 2300, the access point 2330 or the satellite 2350. The operations center 2400 may include one or more computing devices that are capable of exchanging (sending or receiving) data with vehicles, such as the vehicle 2100; external objects, including the external object 2110; or computing devices, such as the server computing device 2500.
[066] In some embodiments, vehicle 2100 identifies a portion or condition of transport network 2200. For example, vehicle 2100 may include one or more sensors in vehicle 2102, such as sensor 1360 shown in figure 1, which includes a speed sensor, wheel speed sensor, camera, gyroscope, optical sensor, laser sensor, radar sensor, sonic sensor or any other sensor or device or combination thereof capable of determining or identifying a portion or condition of the 2200 transport network.
[067] The vehicle 2100 may traverse one or more portions of the transport network 2200 using information communicated through the electronic communication network 2300, such as information representing the transport network 2200, information identified by one or more sensors in the vehicle 2102, or a combination thereof. The external object 2110 may be capable of all or some of the communications and actions described above in relation to the vehicle 2100.
[068] For simplicity, figure 2 shows the vehicle 2100 as the host vehicle, the external object 2110, the transport network 2200, the electronic communication network 2300 and the operations center 2400. However, any number of vehicles, networks or computing devices can be used. In some embodiments, the vehicle transport and communication system 2000 includes devices, units or elements not shown in figure 2. Although the vehicle 2100 or the external object 2110 is shown as a single unit, a vehicle can include any number of interconnected elements.
[069] Although the vehicle 2100 is shown communicating with the operations center 2400 via the electronic communication network 2300, the vehicle 2100 (and the external object 2110) can communicate with the operations center 2400 through any number of direct or indirect communication links. For example, the vehicle 2100 or the external object 2110 can communicate with the operations center 2400 via a direct communication link, such as a Bluetooth communication link. Although, for simplicity, figure 2 shows one transport network 2200 and one electronic communication network 2300, any number of networks or communication devices can be used.
[070] Figure 3 is a block diagram illustrating a remote vehicle assistance center 3000 in accordance with the present disclosure. The remote vehicle assistance center 3000 includes a fleet manager 3010, a plurality of vehicle managers including, but not limited to, a vehicle manager 3020 and a vehicle manager 3030, and a plurality of vehicles including, but not limited to, vehicles 3040, 3050, 3060 and 3070.
[071] The fleet manager 3010 can include a device including some or all of the features of the controller 1300 shown in figure 1. The fleet manager 3010 can monitor and coordinate vehicle managers, including the vehicle managers 3020/3030, as well as the movement of vehicles, including autonomous vehicles and the vehicles 3040/3050/3060/3070. The monitoring and coordination of vehicle managers can include any of: assigning, allocating or withdrawing vehicles to or from vehicle managers; reviewing and monitoring performance data of vehicle managers; and assigning vehicle managers to a geographical area. In an implementation, there may be multiple fleet managers, which can in turn be managed or come under the authority of other fleet managers.
[072] The vehicle manager 3020 can monitor the state or condition of vehicles, including the vehicle 3040 and the vehicle 3050. As illustrated in figure 3, the vehicle manager 3020 has been assigned the vehicle 3040 and the vehicle 3050. The assignment of vehicles to a vehicle manager can be carried out by a fleet manager, such as the fleet manager 3010.
[073] The vehicle manager 3030 can monitor the state or condition of vehicles, including the vehicle 3060 and the vehicle 3070. As illustrated in figure 3, the vehicle manager 3030 has been assigned the vehicle 3060 and the vehicle 3070. The assignment of vehicles to a vehicle manager can be performed by a fleet manager, such as the fleet manager 3010. The assignment of vehicles to a vehicle manager can also be automated using machine learning techniques.
[074] In an implementation, vehicle managers can assemble or group vehicles, communicate with vehicle occupants, remotely operate vehicles and coordinate the movement of vehicles through a transport network or around various obstructions, such as traffic congestion. Vehicle managers can interact with other vehicle managers to assist in the monitoring and management of the vehicles.
[075] The vehicles, including the vehicles 3040/3050/3060/3070, comprise vehicles, such as the vehicle 2100 shown in figure 2, that are being monitored or coordinated by the fleet manager 3010. The vehicles can be operated autonomously or by a human driver and can exchange (send and receive) vehicle data related to the state or condition of the vehicle and its surroundings, including any of: vehicle speed, vehicle location, vehicle operational state, vehicle destination, vehicle route, vehicle sensor data, external object speed and external object location.
[076] Figure 4 is a diagram illustrating an example of a vehicle indicator 4000 for use in an interface according to the present disclosure. In an implementation, the interface is a fleet manager interface. The vehicle indicator 4000 includes a task status indicator 4010, a vehicle mode indicator 4020, a vehicle occupancy indicator 4030 and a time status indicator 4040.
[077] The task status indicator 4010 can be used to indicate a task that is being performed by a vehicle or that is assigned to the vehicle, including any of: traveling to a collection destination, including traveling to a destination to pick up one or more passengers or cargo; traveling to a delivery destination, including traveling to a destination to drop off one or more passengers or cargo; traveling to a maintenance destination, including traveling to a destination where maintenance or repairs can be performed on the vehicle; and traveling to a refueling destination, including traveling to a destination to refuel or recharge the vehicle, including gas stations or electric charging stations.
[078] The characteristics of task status indicator 4010, including shape and color, may correspond to the task being performed by the vehicle. For example, task status indicator 4010 is shown as a square in figure 4, which can indicate, for example, that the vehicle is traveling to a collection destination. In an implementation, a circular shape of the task status indicator 4010 can indicate that the vehicle is traveling to a delivery destination.
Different shapes and colors can indicate different tasks being performed by the vehicle.
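As an illustrative sketch of such a mapping, the following Python dictionary associates each task with a shape and color; apart from the square and circle examples given above, the specific shapes and colors are assumptions made for the example.

```python
# Hypothetical mapping from a vehicle's current task to the shape and color of
# its task status indicator. The square-for-collection and circle-for-delivery
# choices follow the examples above; the remaining entries are assumptions.
TASK_INDICATOR_STYLE = {
    "travel_to_collection":  {"shape": "square",    "color": "blue"},
    "travel_to_delivery":    {"shape": "circle",    "color": "blue"},
    "travel_to_maintenance": {"shape": "triangle",  "color": "orange"},
    "travel_to_refueling":   {"shape": "rectangle", "color": "green"},
}


def indicator_style(task: str) -> dict:
    # Fall back to a neutral style for tasks the mapping does not cover.
    return TASK_INDICATOR_STYLE.get(task, {"shape": "square", "color": "gray"})


print(indicator_style("travel_to_delivery"))  # {'shape': 'circle', 'color': 'blue'}
```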
[079] The vehicle mode indicator 4020 can be used to indicate whether the vehicle is operating in any of: an autonomous mode, including a mode in which the vehicle is driving itself or the vehicle is being driven remotely by a computing device; a directed mode, including a mode in which a human operator is directing the vehicle from a remote location; a manual mode, including a mode in which a human operator is operating the vehicle from within the vehicle; and a semi-autonomous mode, including a mode in which the vehicle can switch between the autonomous and manual modes based on the state of the vehicle (for example, assisted braking can be activated when a proximity and acceleration threshold is exceeded) and a mode in which the vehicle is being controlled simultaneously using autonomous features and human operation. For example, the vehicle mode indicator 4020 is shown as a cross in figure 4, which can indicate, for example, any of the modes mentioned above.
[080] The vehicle occupancy indicator 4030 can be used to indicate any of: whether the vehicle contains one or more passengers; and the status of the passengers in the vehicle. In an implementation, an occupied vehicle is indicated by the vehicle occupancy indicator 4030 being in a filled state (for example, the area within the vehicle occupancy indicator 4030 is the same color as the border around the task status indicator 4010).
[081] The characteristics of the vehicle occupancy indicator 4030, including color, can be used to indicate an issue associated with the vehicle, including any of: a passenger issue, including a request for assistance from a passenger inside the vehicle; a traffic issue, including issues related to traffic congestion, traffic accidents and construction; a decision issue, including issues related to a decision that can be taken by a vehicle manager on whether to take control of the vehicle, redirect the vehicle, establish communication with the vehicle or indicate that an action in relation to the vehicle has been completed; and a physical issue with the state of the vehicle, including issues related to the operational state of the vehicle (for example, engine status, fuel status). In an implementation, a default color can be used to indicate that the vehicle is operating in a normal state and that no issue with the vehicle is pending.
[082] The time status indicator 4040 can be used to indicate the vehicle's temporal state in relation to an expected or predicted temporal state. In an implementation, the color of the time status indicator 4040 can indicate whether the vehicle is ahead of a scheduled time or behind a scheduled time.
[083] The length of the time status indicator 4040 can indicate the magnitude of the deviation from the expected or predicted temporal state. The length of the time status indicator 4040 can be proportional to the magnitude of the deviation from the expected or predicted temporal state (for example, directly proportional, inversely proportional, exponentially proportional or logarithmically proportional).
[084] In another implementation, the length of the time status indicator 4040 may be disproportionate to the deviation from the expected or predicted temporal state (for example, a length of one third indicating a deviation of less than five minutes, a length of two thirds indicating a deviation of more than five minutes and less than fifteen minutes, and a full length indicating a deviation greater than fifteen minutes). Other characteristics of the time status indicator 4040 can be used to indicate the vehicle's status; for example, a red color can indicate that the vehicle is behind a scheduled time and a green color can indicate that the vehicle is ahead of a scheduled time.
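For illustration, the banded and proportional length schemes described in paragraphs [083] and [084], together with the red/green color convention, could be computed as in the following Python sketch; the fifteen-minute full-scale value assumed for the proportional variant is an example choice, not a value given by the disclosure.

```python
def time_indicator_length(deviation_minutes: float, proportional: bool = False) -> float:
    """Return the indicator length as a fraction of its full extent. The banded
    thresholds mirror the example in paragraph [084]; the proportional variant
    of paragraph [083] assumes a fifteen-minute full-scale deviation."""
    magnitude = abs(deviation_minutes)
    if proportional:
        return min(magnitude / 15.0, 1.0)
    if magnitude < 5.0:
        return 1.0 / 3.0
    if magnitude < 15.0:
        return 2.0 / 3.0
    return 1.0


def time_indicator_color(deviation_minutes: float) -> str:
    # Red when the vehicle is behind schedule, green when it is ahead.
    return "red" if deviation_minutes > 0 else "green"


print(time_indicator_length(7.0), time_indicator_color(7.0))  # 0.666... red
```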
[085] Figure 5 is a diagram that illustrates an example of a fleet manager interface 5000 in accordance with the present disclosure. The fleet manager interface 5000 can be generated based on one or more instructions that are executable on a computing device, including the controller device 2410 shown in figure 2, and that can be stored in a memory of a computing device, including the controller device 2410.
[086] For example, the fleet manager interface 5000 can be generated by the controller device 2410 based on instructions that are interpreted by a client computing device that accesses the controller device 2410 through a computer network. The client computing device can then generate a representation of the fleet manager interface 5000 on a display device.
[087] The fleet manager interface 5000 includes a fleet manager portion 5010, a map portion 5020, a vehicle manager indicator 5030, a vehicle indicator 5040 and a vehicle manager assignment queue 5050, any of which may be based on data associated with the state of physical objects, including, but not limited to, at least one of vehicles, roads, buildings and pedestrians.
[088] The fleet manager portion 5010 includes a representation of the objects that are being monitored or tracked by the fleet manager and/or the vehicle managers, including the association of vehicle managers with vehicles. The objects can include vehicles, including the vehicle 2100 shown in figure 2. The objects can be represented as indicators, such as the vehicle indicator 5040, which can be generated as a still image or a moving image, such as the vehicle indicator 4000 shown in figure 4. In addition, the fleet manager portion 5010 can receive input, including any of touch inputs, voice inputs and inputs from an input device. For example, vehicle indicators, including the vehicle indicator 5040, can be selected by an operator, such as a vehicle manager. The selection of a vehicle indicator can generate data on the state or condition of the respective vehicle represented by the vehicle indicator (for example, the selected vehicle indicator can indicate whether the vehicle is functioning correctly or will arrive at a destination on time).
[089] The map portion 5020 includes a representation of a geographical area, including objects within a predefined geographical area. In an implementation, the predefined geographical area can include a geographical area corresponding to the geographical area that includes all or at least some portion of the vehicles being monitored by one of the vehicle managers. The objects within the geographical area can include any of the vehicles and external objects, including roads, buildings and pedestrians. The map portion 5020 can receive input, including any of touch inputs, voice inputs and inputs from an input device. An input to the map portion can generate data about the state or condition of the selected vehicles or external objects.
[090] In one implementation, the map portion 5020 contains the same representation of objects that are displayed in the fleet manager portion 5010. In another implementation, the number and type of objects displayed in the fleet manager portion 5010 and the map portion 5020 may differ. For example, the vehicle manager can zoom in on a particular geographical area, thereby displaying only a subset of the objects or vehicles that are represented in the fleet manager portion 5010.
[091] The vehicle manager indicator 5030 is a representation of an identifier for a vehicle manager. Each of the vehicle managers displayed in the fleet manager interface 5000 includes a separate vehicle manager indicator. A vehicle manager can be associated with one or more vehicles, which can be assigned or distributed by the fleet manager or dynamically, using machine learning techniques. For example, the fleet manager can modify the number of vehicles assigned to a vehicle manager, including any of adding or removing vehicles and transferring one or more vehicles from one vehicle manager to another vehicle manager.
[092] The vehicle indicator 5040 is a representation of the state or condition of an autonomous vehicle, the state including any of a vehicle task, vehicle occupancy, the vehicle's operating mode (for example, autonomous operation or manual operation) and a vehicle issue, including, but not limited to, an issue with the vehicle's operational status. The vehicle indicator 5040 may include various colors, shapes, patterns, text, pictograms or any combination thereof to represent aspects of the state or condition of the vehicle. As an example, the vehicle indicator 5040 can represent an autonomous vehicle that is traveling to a destination to pick up a passenger. In addition, the vehicle indicator 5040 can represent an autonomous vehicle that is transporting a passenger and traveling to a destination in order to drop off the passenger.
[093] The vehicle manager assignment queue 5050 is a representation of the vehicles assigned to a vehicle manager. Vehicles can be assigned to the vehicle manager assignment queue 5050 by the fleet manager, by the vehicle managers themselves, or automatically using machine learning techniques. For example, a vehicle manager may perceive that they are monitoring too many vehicles and may assign a subset of those vehicles to another vehicle manager that has additional monitoring capacity. As shown in figure 5, the vehicle indicators (for example, the vehicle indicator 5040) within the vehicle manager assignment queue 5050 are assigned to the vehicle manager associated with the vehicle manager indicator 5030 "Larry". This vehicle manager indicator can represent the real name of the vehicle manager, a username, or another identifier.
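As a purely illustrative sketch of such a reassignment, the following Python function moves vehicles from one manager's assignment queue to another's once an assumed monitoring capacity is exceeded; the capacity threshold and the second manager's name are hypothetical.

```python
from typing import Dict, List


def offload_excess(queues: Dict[str, List[str]], source: str, target: str, capacity: int) -> None:
    # Move vehicles out of the source manager's assignment queue until it is
    # back under the capacity threshold (an assumed parameter).
    while len(queues[source]) > capacity:
        queues[target].append(queues[source].pop())


queues = {"larry": ["v1", "v2", "v3", "v4", "v5"], "other_manager": ["v6"]}
offload_excess(queues, "larry", "other_manager", capacity=3)
print(queues)  # {'larry': ['v1', 'v2', 'v3'], 'other_manager': ['v6', 'v5', 'v4']}
```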
[094] Figure 6 is a diagram illustrating an example of a vehicle indicator 6000 for use in an interface according to the present disclosure. In one implementation, the interface is the fleet manager interface 5000 illustrated in figure 5, a vehicle manager interface 7000 illustrated in figure 7 or a vehicle manager interface 9000 illustrated in figure 9.
[095] The vehicle indicator 6000 includes a next task indicator 6010, a current task indicator 6020, an actual progress indicator 6030, a deviation magnitude indicator 6040, an expected progress indicator 6050, a time scale 6060, a time to completion 6070, an elapsed time 6080 and a time compression indicator 6090.
[096] The next task indicator 6010 can be used to indicate a task that is assigned to the vehicle and is not currently being performed. The next task indicator 6010 can indicate a task that will be performed after the completion of a current task (for example, a current task associated with the current task indicator 6020). For example, the next task indicator 6010 may indicate that a delivery will occur after the completion of the current task associated with the current task indicator 6020. In an implementation, an interaction with the next task indicator 6010 (for example, selecting the next task indicator 6010) can show a description of the task that will be performed next.
[097] The current task indicator 6020 can be used to indicate a task that is currently being performed by a vehicle. For example, the current task indicator 6020 may indicate picking up a passenger at a designated location. In an implementation, an interaction with the current task indicator 6020 (for example, selecting the current task indicator 6020) can show a description of the task that is currently being performed.
[098] The next task indicator 6010 and the current task indicator 6020 can be associated with tasks including, but not limited to, any of: traveling to a collection destination, including traveling to a destination to pick up one or more passengers or cargo; traveling to a delivery destination, including traveling to a destination to drop off one or more passengers or cargo; traveling to a maintenance destination, including traveling to a destination where maintenance or repairs can be performed on the vehicle; and traveling to a refueling destination, including traveling to a destination to refuel or recharge the vehicle, including gas stations or electric charging stations.
[099] The shape of the next task indicator 6010 or the current task indicator 6020 may correspond to the task being performed by the vehicle. For example, the next task indicator 6010 is shown as a circle in figure 6, which indicates that the vehicle is traveling to a delivery destination. The circular shape of the current task indicator 6020 can indicate that the vehicle is traveling to a pickup destination. The shape can include, but is not limited to, circles, squares, triangles, rectangles, and so on.
[0100] A pattern (for example, a cross shape or zigzag) on the next task indicator 6010 or the current task indicator 6020 can indicate whether the vehicle is operating in any of: an autonomous mode, including a mode in which the vehicle drives itself or the vehicle is being driven remotely by a computing device; a directed mode, including a mode in which a human operator is driving the vehicle from a remote location; a manual mode, including a mode in which a human operator is operating the vehicle from within the vehicle; and a semi-autonomous mode, including a mode in which the vehicle can switch between autonomous and manual mode based on the state of the vehicle and a mode in which the vehicle is being controlled simultaneously using autonomous resources and human operation. For example, the vehicle mode indicator 4020 is shown as a cross in figure 4, which can indicate, for example, any of the modes mentioned above.
[0101] Characteristics of the next task indicator 6010 or the current task indicator 6020, including a fill, can be used to indicate whether the vehicle contains one or more passengers. In an implementation, an occupied vehicle is indicated by the next task indicator 6010 or the current task indicator 6020 being in a filled state. For example, no fill (for example, no pattern, no shading, and a light color) can be used to indicate that the vehicle does not contain occupants.
[0102] The color of the next task indicator 6010 or the current task indicator 6020 can be used to indicate an issue associated with the vehicle, including any of: a passenger issue, including a request for assistance from a passenger inside the vehicle; a traffic issue, including issues related to traffic congestion, traffic accidents, and construction; a decision issue, including issues related to a decision that can be taken by a vehicle manager on whether to take control of the vehicle; and a physical issue with the state of the vehicle, including issues related to the operational state of the vehicle (for example, engine status, fuel status). In an implementation, a default color can be used to indicate that the vehicle is operating in a normal state and that no issue with the vehicle is pending.
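As a non-limiting illustration, the shape, pattern, fill, and color conventions described above for the task indicators can be pictured as a simple lookup structure. The sketch below is illustrative only; the enumeration and mapping names (Task, Mode, TASK_SHAPES, and so on) are hypothetical and are not defined by this disclosure.

```python
from enum import Enum

class Task(Enum):
    PICKUP = "pickup"
    DELIVERY = "delivery"
    MAINTENANCE = "maintenance"
    REFUEL = "refuel"

class Mode(Enum):
    AUTONOMOUS = "autonomous"
    DIRECTED = "directed"
    MANUAL = "manual"
    SEMI_AUTONOMOUS = "semi_autonomous"

# Hypothetical visual conventions: shape encodes the task, pattern encodes
# the operating mode, fill encodes occupancy, and color encodes a pending issue.
TASK_SHAPES = {Task.PICKUP: "circle", Task.DELIVERY: "square",
               Task.MAINTENANCE: "triangle", Task.REFUEL: "rectangle"}
MODE_PATTERNS = {Mode.AUTONOMOUS: "cross", Mode.DIRECTED: "zigzag",
                 Mode.MANUAL: "solid", Mode.SEMI_AUTONOMOUS: "dashed"}
ISSUE_COLORS = {"none": "gray", "passenger": "blue", "traffic": "orange",
                "decision": "purple", "physical": "red"}

def indicator_style(task, mode, occupied, issue="none"):
    """Return the drawing attributes for a single vehicle indicator."""
    return {
        "shape": TASK_SHAPES[task],
        "pattern": MODE_PATTERNS[mode],
        "fill": "filled" if occupied else "empty",
        "color": ISSUE_COLORS.get(issue, "red"),
    }

# Example: an occupied autonomous vehicle en route to a delivery with no pending issue.
print(indicator_style(Task.DELIVERY, Mode.AUTONOMOUS, occupied=True))
```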
[0103] The actual progress indicator 6030 indicates the actual portion of a route distance that has been traveled, or the time that has passed, en route to a destination. For example, if the progress indicator is at the midpoint of the 6060 timescale, half the route distance has been completed, or half the estimated travel time has passed.
[0104] The expected progress indicator 6050 indicates a portion of a route distance that has been estimated to have been completed by the vehicle by the current time and may include a portion of the estimated time to travel to a destination or a portion of the estimated distance traveled by the vehicle. The deviation magnitude indicator 6040 indicates the portion of the route distance, or the portion of the travel time, by which the expected progress (indicated by the expected progress indicator 6050) deviates from the actual progress (indicated by the actual progress indicator 6030).
[0105] The 6060 time scale indicates the total travel time or the total travel distance to complete a route for the vehicle. For example, if the 6060 time scale is representative of a total travel time of thirty minutes, half of the 6060 time scale is fifteen minutes. In the case where the 6060 time scale covers a longer time period, the 6090 time compression indicator may indicate that a portion of the 6060 time scale that is not proportional to the remaining part of the 6060 time scale has elapsed. For example, the 6090 time compression indicator can indicate that half of the 6060 time scale has elapsed. The elapsed time 6080 indicates the travel time that has elapsed on the way to a destination.
[0106] The total time to complete a route can be represented by a length of the 6060 time scale that includes the length of the deviation magnitude indicator 6040, the length of the time to complete 6070, and the length of the elapsed time 6080. As an example, the time to complete 6070 indicates the remaining travel time before the vehicle reaches its destination or completes the associated/assigned task.
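As a rough illustration of how the quantities behind indicators 6030 to 6090 relate, the sketch below computes actual progress, expected progress, and the deviation magnitude from an elapsed time and an estimated total travel time. The function name and the choice of a normalized, time-based scale are assumptions made only to make the arithmetic concrete; they are not defined by this disclosure.

```python
def progress_quantities(elapsed_s, estimated_total_s, distance_done_m=None,
                        route_length_m=None):
    """Compute the fractions behind the actual/expected progress and
    deviation magnitude indicators on a normalized 0..1 scale."""
    # Expected progress: the fraction of the estimated total time that has passed.
    expected = min(elapsed_s / estimated_total_s, 1.0)
    # Actual progress: fraction of the route distance covered, if known;
    # otherwise fall back to the time-based estimate.
    if distance_done_m is not None and route_length_m:
        actual = min(distance_done_m / route_length_m, 1.0)
    else:
        actual = expected
    deviation = expected - actual          # > 0 means the vehicle is behind plan
    time_to_complete = max(estimated_total_s - elapsed_s, 0)
    return {"actual": actual, "expected": expected,
            "deviation": deviation, "time_to_complete_s": time_to_complete}

# Example: 15 of an estimated 30 minutes elapsed, 40% of the route distance covered.
print(progress_quantities(elapsed_s=900, estimated_total_s=1800,
                          distance_done_m=4000, route_length_m=10000))
```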
[0107] Figure 7 is a diagram illustrating an example of a vehicle manager interface 7000 according to the present disclosure. The vehicle manager interface 7000 can be generated based on one or more instructions that are executable on a computing device, including the 2410 controller device, as shown in figure 2, and that can be stored in a memory of a computing device, including the 2410 controller device.
[0108] For example, the vehicle manager interface 7000 can be generated by the controller device 2410, based on instructions that are interpreted by a client computing device that accesses the controller device 2410 through a computer network. The client computing device can then generate a representation of the vehicle manager interface 7000 on a display device.
[0109] The vehicle manager interface 7000 includes a vehicle manager portion 7010, a map portion 7020, a vehicle indicator 7030 and a vehicle indicator 7040, any of which can be based on data associated with the status of the physical objects including any of the vehicles and external objects, including, but not limited to pedestrians, cyclists, roads and buildings.
[0110] The vehicle manager portion 7010 includes a representation of objects being monitored or tracked by the vehicle manager using the vehicle manager interface 7000. A plurality of vehicle managers can monitor a plurality of vehicles, each with their own specific interface. Objects can include vehicles, including the vehicle 2100 shown in figure 2. Objects can be represented as indicators, such as the vehicle indicator 7030, which can be generated as a variety of images, including, but not limited to, a static image, a dynamic image, a moving image, a live photo or video feed, or any combination thereof. In addition, the vehicle manager portion 7010 can receive input including any of touch inputs, voice inputs, and inputs from an input device.
[0111] The 7020 map portion includes a representation of a geographical area, including objects within the geographical area. Objects within the geographical area can include any of the vehicles and external objects, including roads, buildings, cyclists, and pedestrians. In an implementation, the map portion 7020 may represent objects that are similar to or different from the objects represented by the vehicle manager portion 7010.
[0112] Vehicle indicator 7030 and vehicle indicator 7040 are representations of the state or condition of an autonomous vehicle, including any of a vehicle task, vehicle occupancy, the vehicle's operating mode (for example, autonomous operation or manual operation), and a vehicle issue, including, but not limited to, an issue with the vehicle's operational status. The vehicle indicator 7030 and the vehicle indicator 7040 can include various colors, shapes, patterns, text, or pictograms to represent aspects of the state or condition of the autonomous vehicle.
[0113] As an example, the vehicle indicator 7030 can represent an autonomous vehicle that is traveling to a destination to pick up a passenger. In addition, the vehicle indicator 7040 may represent an autonomous vehicle that is carrying another passenger and traveling to a destination in order to drop off that passenger. The different tasks or actions that the respective autonomous vehicles are carrying out result in differences in the graphical display between the vehicle indicators 7030 and 7040 (for example, the vehicle indicator 7030 has a filled circle and the vehicle indicator 7040 has an unfilled square).
[0114] Figure 8 is a diagram illustrating an example of a vehicle manager interface 8000 according to the present disclosure. The vehicle manager interface 8000 can be generated based on one or more instructions that are executable on a computing device, including the 2410 controller device, as shown in figure 2, and that can be stored in a memory of a computing device, including the 2410 controller device.
[0115] For example, the vehicle manager interface 8000 can be generated by the controller device 2410, based on instructions that are interpreted by a client computing device that accesses the controller device 2410 through a computer network. The client computing device can then generate a representation of the vehicle manager interface 8000 on a display device.
[0116] The vehicle manager interface 8000 resembles the vehicle manager interface 7000 and includes a vehicle manager portion 8010 (similar to the vehicle manager portion 7010, as shown in figure 7), a map portion 8020 (similar to the map portion 7020, as shown in figure 7), an 8030 vehicle indicator, an 8040 task control, an 8050 resolved control, an 8060 call control, and an 8070 redirect control, any of which can be based on data associated with the state of physical objects including, but not limited to, at least one of the vehicles, roads, buildings, and pedestrians. In another implementation, the vehicle manager interface 8000 includes control functions other than the task control 8040, the resolved control 8050, the call control 8060, and the redirect control 8070 that allow the vehicle manager to interface with and control various aspects of the respective autonomous vehicle or object being monitored or tracked.
[0117] The vehicle manager portion 8010 includes a representation of objects being monitored or tracked. Objects can include vehicles, including the vehicle 2100 shown in figure 2. Objects can be represented as indicators, such as the vehicle indicator 8030, which can be generated as a still image, a moving image, or a different type of image. In addition, the vehicle manager portion 8010 can receive input including any of touch inputs, voice inputs, and inputs from an input device. For example, vehicle indicators, including the vehicle indicator 8030, can be selected by an operator of the vehicle manager interface 8000, such as a vehicle manager. The selection of vehicle indicators can generate data on the state or condition of the respective vehicle represented by the vehicle indicators (for example, the selected vehicle indicator can indicate whether the vehicle will reach a destination on time).
[0118] The 8020 map portion includes a representation of a geographical area including objects within the geographical area. Objects within the geographical area can include any of the vehicles and external objects, including, but not limited to, roads, buildings, cyclists, and pedestrians. The 8020 map portion can receive input including any of touch inputs, voice inputs, and inputs from an input device. Input to the map portion can generate data on the state or condition of the selected vehicles or external objects. In an implementation, the map portion 8020 may represent objects that are similar to or different from the objects represented by the vehicle manager portion 8010.
[0119] For example, the selection of a building, such as a stadium, can generate data indicating that a sporting event is taking place at the stadium within a certain period of time. Thus, the vehicle manager can anticipate congestion in the vicinity of the stadium at the conclusion of the sporting event, due to the increased traffic flow resulting from fans leaving the stadium. Therefore, the vehicle manager can redirect or change the completion time of one of the autonomous vehicles they are monitoring and which is scheduled to perform a specific task near the stadium at the conclusion of the sporting event.
[0120] The vehicle indicator 8030 includes a representation of the state or condition of a vehicle (for example, an autonomous vehicle or a vehicle driven by a person) and includes any of a vehicle task, vehicle occupancy, the vehicle's operating mode (for example, autonomous operation or manual operation), and a vehicle issue, including, but not limited to, an issue with the vehicle's operational status. The 8030 vehicle indicator can include several characteristics, including colors, shapes, patterns, text, or pictograms, to represent aspects of the vehicle's status or condition. As an example, the 8030 vehicle indicator can represent an autonomous vehicle that is traveling to a destination to pick up a passenger. Alternatively, the 8030 vehicle indicator can represent an autonomous vehicle that transports a passenger and travels to a destination to drop off the passenger.
[0121] Any of the task control 8040, the resolved control 8050, the call control 8060, and the redirect control 8070 can be controlled or modified based on an input, including a user input based on an input received through an input device, including a tactile input device (for example, a keyboard, mouse, or touchscreen), an audio input device (for example, a microphone), and a visual input device (for example, a camera). In addition, any of the 8040 task control, the 8050 resolved control, the 8060 call control, and the 8070 redirect control can be controlled or modified based on instructions, such as instructions from computer programs (for example, instructions to select vehicle indicators that meet pre-established criteria, such as a common destination).
[0122] Task control 8040 can be used to modify the task that is associated with a vehicle. For example, the vehicle associated with the vehicle indicator 8030 may have completed a delivery. A vehicle manager can interact with the task control 8040 and modify the vehicle's task to indicate that the vehicle should now pick up a passenger instead of completing a previously assigned task. The task can be modified and/or updated while the task is being completed or in relation to a future task. For example, the current task can be set to deliver a package at a certain time, but based on traffic conditions, the current task is updated to pick up a nearby passenger and drop them off at a location that is not within the area of traffic congestion. In another example, one of the vehicle's next tasks can be modified, updated, or deleted while the vehicle is completing a current, unrelated task.
[0123] The 8050 resolved control can be used to indicate that a vehicle-related issue has been resolved or completed by the vehicle manager. For example, after a vehicle manager receives a help request from a vehicle associated with the vehicle indicator 8030, or from a passenger of the vehicle, and provides assistance for the vehicle, the 8050 resolved control can be activated by the vehicle manager to indicate that the issue has been resolved and is no longer pending. In one implementation, activating the 8050 resolved control can modify the vehicle data associated with the vehicle, including a vehicle task urgency that includes an indication of the urgency of a vehicle request or a vehicle task (for example, an ambulance transporting a patient to a hospital). For example, a vehicle transporting a patient in urgent need of medical assistance could send a request to a vehicle manager for an optimized redirect, and as soon as the vehicle manager handles that request or concludes that additional assistance is needed, the vehicle manager could interact with the 8050 resolved control to update the request status.
[0124] The 8060 call control can be used to contact and communicate with the vehicle associated with the 8030 vehicle indicator. For example, when the 8060 call control is activated, a vehicle manager can interact with an occupant or passenger of the vehicle associated with the 8030 vehicle indicator. In an implementation, when the 8060 call control is activated, either an audio link or the audio and video link (for example, live video communication feeds) can be established with the vehicle associated with the vehicle indicator 8030.
[0125] The 8070 redirect control can be used to modify a route associated with a vehicle. For example, the vehicle associated with the 8030 vehicle indicator may be in transit to a destination via a route that will pass through highly congested traffic. The 8070 redirect control could be used to redirect the vehicle to avoid entering the area with highly congested traffic. In another implementation, the 8070 redirect control may be a different type of control that provides autonomous vehicle teleoperation.
[0126] Figure 9 is a diagram illustrating an example of a vehicle manager interface 9000 according to the present disclosure. The vehicle manager interface 9000 can be generated based on one or more instructions that are executable on a computing device, including the controller device 2410, and that can be stored in a memory of a computing device, including the controller device 2410.
[0127] For example, the vehicle manager interface 9000 can be generated by the controller device 2410, based on instructions that are interpreted by a client computing device that accesses the controller device 2410 through a computer network. The client computing device can then generate a representation of the vehicle manager interface 9000 on a display device.
[0128] The vehicle manager interface 9000 resembles the vehicle manager interface 8000, as shown in figure 8, and includes a vehicle manager portion 9010, a map portion 9020, a vehicle indicator 9030, a vehicle indicator 9040, a cluster control 9050, and an area selection control 9060, any of which can be based on data associated with the state of physical objects including, but not limited to, at least one of the vehicles, roads, buildings, and pedestrians.
[0129] The vehicle manager portion 9010 includes a representation of objects being monitored or tracked. Objects can include vehicles, including the vehicle 2100 shown in figure 2. Objects can be represented as indicators, such as the vehicle indicator 9030, which can be generated as a still image, a moving image, or any other type of image. In addition, the vehicle manager portion 9010 can receive input including any of touch inputs, voice inputs, and inputs from an input device.
[0130] As an example, vehicle indicators, including the vehicle indicator 9030 and the vehicle indicator 9040, can be selected by an operator, such as a vehicle manager. The selection of vehicle indicators can generate data on the status or condition of the respective vehicle represented by the vehicle indicators (for example, the selected vehicle indicator can indicate whether the vehicle will reach a destination on time).
[0131] The 9020 map portion includes a representation of a geographical area, including objects within the geographical area. Objects within the geographical area can include any of the vehicles and external objects, including, but not limited to, roads, buildings, cyclists, and pedestrians. The 9020 map portion can receive input including any of touch inputs, voice inputs, and inputs from an input device. Input to the map portion can generate data on the state or condition of the selected vehicles or external objects. For example, selecting a building, such as a stadium, can generate data indicating that a sporting event is taking place at the stadium within a certain period of time. Thus, the vehicle manager can anticipate congestion in the vicinity of the stadium at the conclusion of the sporting event, due to the increased traffic flow resulting from fans leaving the stadium, and redirect the vehicle accordingly.
[0132] The vehicle indicator 9030 and the vehicle indicator 9040 are representations of the state or condition of two separate autonomous vehicles, including any of a vehicle task, vehicle occupancy, the vehicle's operating mode (for example, autonomous operation or manual operation), and a vehicle issue, including an issue with the vehicle's operational status. The vehicle indicator 9030 and the vehicle indicator 9040 can include various colors, shapes, patterns, text, or pictograms to represent aspects of the state or condition of the vehicle. As an example, the vehicle indicator 9030 can represent an autonomous vehicle that is traveling to a destination to pick up a passenger. In addition, the 9040 vehicle indicator can represent an autonomous vehicle that is transporting another passenger and traveling to a destination in order to drop off that passenger.
[0133] The 9050 cluster control and the 9060 area selection control include control elements that can be controlled or modified based on an input, including any user input based on an input received by an input device, including a tactile input device (for example, a keyboard, mouse, or touchscreen), an audio input device (for example, a microphone), and a visual input device (for example, a camera). In addition, the cluster control 9050 and the area selection control 9060 can be controlled or modified based on instructions, such as computer program instructions (for example, instructions for selecting vehicle indicators that meet pre-established criteria, such as a common destination).
[0134] The area selection control 9060 is a control element that can be controlled or modified to select a section of the 9020 map portion. For example, a rectangular section of the map can be selected or highlighted by the vehicle manager to define that vehicles within that geographical area of the map will be monitored and selected or grouped in the part of the vehicle manager portion 9010 that corresponds to the selected section of the 9020 map portion. Objects within the selected section of the 9020 map portion can be monitored and organized according to a grouping criterion, including any of common routes, destinations, and starting points. As shown in figure 9, the cluster control 9050 can indicate that the vehicle indicator 9030 and the vehicle indicator 9040 are part of a cluster based on shared grouping criteria (for example, they share similar routes).
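As a non-limiting illustration, one way to realize the area selection and grouping behavior described for the controls 9050 and 9060 is to filter the vehicles whose coordinates fall inside the selected rectangle and then group them by a shared criterion, such as destination. The sketch below is illustrative only; the data layout and helper names are assumptions, not part of this disclosure.

```python
from collections import defaultdict

def select_area(vehicles, lat_min, lat_max, lon_min, lon_max):
    """Return the vehicles whose current position lies inside the selected
    rectangular section of the map portion."""
    return [v for v in vehicles
            if lat_min <= v["lat"] <= lat_max and lon_min <= v["lon"] <= lon_max]

def group_by(vehicles, key="destination"):
    """Group the selected vehicles by a shared criterion (destination, route, ...)."""
    clusters = defaultdict(list)
    for v in vehicles:
        clusters[v[key]].append(v["id"])
    return dict(clusters)

# Example: two vehicles inside the selection share a destination and form a cluster.
fleet = [
    {"id": "v9030", "lat": 37.78, "lon": -122.41, "destination": "stadium"},
    {"id": "v9040", "lat": 37.79, "lon": -122.40, "destination": "stadium"},
    {"id": "v9055", "lat": 37.30, "lon": -121.90, "destination": "airport"},
]
selected = select_area(fleet, 37.70, 37.85, -122.50, -122.35)
print(group_by(selected))   # {'stadium': ['v9030', 'v9040']}
```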
[0135] Figure 10 is a diagram illustrating an example of a vehicle manager interface 10000 according to the present disclosure. The vehicle manager interface 10000 can be generated based on one or more instructions that are executable on a computing device, including the 2410 controller device, as shown in figure 2, and that can be stored in a memory of a computing device, including the 2410 controller device.
[0136] For example, the vehicle manager interface 10000 can be generated by the controller device 2410, based on instructions that are interpreted by a client computing device that accesses the controller device 2410 through a computer network. The client computing device can then generate a representation of the vehicle manager interface 10000 on a display device.
[0137] The 10000 vehicle manager interface includes a vehicle indicator 10010, a trajectory indicator 10020, an external object indicator 10030, an obstruction indicator 10040, and a pedestrian indicator 10050, any of which can be based on data associated with the state of physical objects including, but not limited to, at least one of the vehicles, roads, buildings, and pedestrians. A plurality of configurations of external objects, obstructions, pedestrians, and any combination thereof can be displayed in the vehicle manager interface 10000. The vehicle indicator 10010 can be used to represent a vehicle. In this example, the vehicle is represented as a three-dimensional model; however, the vehicle indicator 10010 can be represented in different ways, including any of a two-dimensional image and a pictogram, such as an icon.
[0138] The trajectory indicator 10020 can be used to represent a trajectory between the vehicle's current location and a vehicle destination. In an implementation, a vehicle manager can guide the vehicle associated with the vehicle indicator 10010 along the trajectory indicated by the trajectory indicator 10020. For example, when providing remote assistance to a vehicle associated with the vehicle indicator 10010, a trajectory indicator, such as a virtual lane, can be generated in order to provide a visual representation of the path the vehicle can travel, which is illustrated by the trajectory indicator 10020.
[0139] The external object indicator 10030 can be used to represent external objects, such as other vehicles, that could, as an example, alter the intended route of the vehicle. The 10040 obstruction indicator can be used to represent external objects that can obstruct the movement of the vehicle represented by the 10010 vehicle indicator. The 10050 pedestrian indicator can be used to represent an external object including a pedestrian, a cyclist, or another moving object. The pedestrian indicator 10050 can be indicated with a distinct color scheme that is different from other external objects represented by the external object indicator 10030 or the obstruction indicator 10040. In this way, pedestrians can be distinguished from other types of external objects to provide additional awareness and avoidance capabilities. In an implementation, the external object indicator 10030, the obstruction indicator 10040, and the pedestrian indicator 10050, or any combination thereof, can be represented by the same or a similar type of indicator that covers all objects that could affect at least one parameter (for example, route, travel time, etc.) of the vehicle represented by the 10010 vehicle indicator.
[0140] The steps, or operations, of any method, process, or algorithm described in connection with the implementations of the technology disclosed here can be implemented in hardware, firmware, software executed by hardware, circuits, or any combination thereof. To facilitate the explanation, the processes 11000 to 14000, shown in figures 11 to 14, are represented and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders or simultaneously. In addition, operations in accordance with this disclosure may occur with other operations not shown and described here.
[0141] Figure 11 is a flow chart of a technique 11000 for remote support according to the present disclosure. In an implementation, the technique 11000 is used by a vehicle monitoring system or remote support system that includes any of a fleet manager, a vehicle manager, and the above-mentioned interfaces. Some or all aspects of the technique 11000 for remote support can be implemented in a vehicle including the vehicle 1000 shown in figure 1, the vehicle 2100 shown in figure 2, or a computing device including the controller device 2410 shown in figure 2. In an implementation, some or all aspects of the technique 11000 for remote support can be implemented in a system that combines some or all of the features described in this disclosure.
[0142] In operation 11010, state data (or status data) is received from one or more of the vehicles. In an implementation, the state data is received by a communication system or similar device of the vehicle monitoring system. The one or more vehicles may include a device or apparatus (for example, a means of transport) that is used to transport objects, including any of one or more passengers and cargo. The one or more vehicles can include any of an autonomous vehicle, a vehicle that is driven by a human driver, and a semi-autonomous vehicle.
[0143] State data includes, but is not limited to, data indicating the state or condition of vehicles, including any of: kinetic state data relating to any of a vehicle's speed and acceleration; location data, including the geographical location of a vehicle (for example, the vehicle's latitude and longitude) or the location of the vehicle in relation to another object; vehicle position, including the vehicle's orientation and inclination (for example, the inclination of the vehicle on a slope); the operational state of the vehicle, including the electrical or mechanical state of the vehicle (for example, the health of the vehicle's electrical systems, the vehicle's mechanical systems, tire pressure, etc.); maintenance data related to vehicle maintenance; power source data, including remaining fuel or remaining battery charge; sensor data based on sensors, including optical sensors, audio sensors, and motion sensors; internal state data, including temperature and humidity inside the vehicle's passenger cabin; and a current task (for example, picking up a passenger) of the vehicle.
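The categories of state data listed above can be pictured as a single record per vehicle. The following dataclass is only a minimal sketch of such a record; the field names and types are assumptions chosen for illustration and do not correspond to any specific message format defined in this disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class VehicleStateData:
    """Illustrative per-vehicle state record assembled from the categories above."""
    vehicle_id: str
    speed_mps: float = 0.0            # kinetic state data
    acceleration_mps2: float = 0.0
    latitude: float = 0.0             # location data
    longitude: float = 0.0
    heading_deg: float = 0.0          # vehicle position / orientation
    incline_deg: float = 0.0
    operational_ok: bool = True       # electrical / mechanical state
    tire_pressure_kpa: dict = field(default_factory=dict)
    fuel_fraction: float = 1.0        # power source data (fuel or battery)
    cabin_temp_c: float = 21.0        # internal state data
    cabin_humidity_pct: float = 40.0
    current_task: str = "idle"        # e.g. "pickup", "delivery"

# Example record as it might be received by the vehicle remote support apparatus.
state = VehicleStateData(vehicle_id="v-2100", speed_mps=12.5,
                         latitude=37.78, longitude=-122.41,
                         fuel_fraction=0.35, current_task="pickup")
print(state.vehicle_id, state.fuel_fraction)
```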
[0144] In operation 11020, in response to receiving and processing the state data, the communication system transmits instruction data to the vehicles, including a first vehicle. Instruction data includes, but is not limited to, instructions to be carried out by the vehicle (for example, the autonomous vehicle) or by a vehicle occupant (for example, a driver). Instruction data can include data associated with any of: controlling movement of a vehicle, including changing the acceleration, speed, or direction (for example, steering) of the vehicle; enabling or disabling (for example, turning some or all parts on or off) a vehicle control system, including mechanical control systems and electrical control systems; enabling or disabling sensors in the vehicle (for example, activating a camera to view the interior of the vehicle or an area outside the vehicle); and activating or deactivating a communication system, including any of internal communication systems (for example, internal speakers directed at vehicle occupants) and external communication systems (for example, external speakers directed at objects or individuals outside the vehicle).
[0145] In an implementation, the transmission of instruction data to a vehicle, such as the first vehicle, comprises transmitting driving route data for implementation by an autonomous device or system of the vehicle. The driving route data can include instructions for modifying an existing driving route for the vehicle. For example, the transmission of the driving route data can be associated with the satisfaction of a condition, including a change in the state data (for example, the driving route data can be transmitted when a threshold level of traffic congestion is determined to be occurring on the existing driving route).
[0146] In another implementation, the transmission of instruction data to an autonomous vehicle, such as the first vehicle, can be associated with ignoring at least one operational constraint of the autonomous operation to allow the first vehicle to traverse the driving route. As an example, the autonomous operation of a vehicle may include operational constraints that prohibit certain types of vehicle actions, including violating a speed limit, including driving too fast or driving too slowly, violating a traffic regulation, including driving in the direction opposite to traffic, and moving off a road surface. By ignoring an operational constraint, a vehicle is able to perform otherwise prohibited actions according to the instructions in the instruction data. For example, bypassing a construction zone could include driving on a portion of the road that was previously restricted, such as when the portion of the road is not a paved road surface and is instead a dirt road.
[0147] In operation 11030, vehicle state data is generated and displayed on a vehicle monitoring system interface. The state data can be generated as, or converted to, state data indicators associated with the state data. The state data indicators can be displayed at control stations, including first level control stations (for example, fleet manager control stations that may include fleet manager interfaces) or at second level control stations (for example, vehicle manager control stations that may include vehicle manager interfaces). As an example, the first level control stations may include computing devices or computing apparatuses that generate the state data on a display device, including a monitor, television, or other device that can display images. In addition, the state data indicators can be output to a display, such as the fleet manager interface 5000 shown in figure 5.
[0148] In operation 11040, one of the first level control stations determines whether a change in the state data indicates that the autonomous operation of the first vehicle is operating outside (for example, exceeds) the defined parameter values. The defined parameter values include any one or more states or conditions of the state data that are defined to be within a predetermined range or within a predetermined threshold. For example, the power source data in the state data may indicate that the vehicle's fuel tank has only a small amount of fuel remaining, that most of the fuel has been used, or that the fuel level is below a fuel level threshold parameter value (for example, a comparison of the available fuel value with the parameter value defined for available fuel indicates that the vehicle is operating outside the defined available fuel parameter value).
[0149] In response to a determination that the state data indicates that the autonomous operation of the first vehicle is operating outside the defined parameter values, the process 11000 proceeds to operation 11050. In response to a determination that the state data indicates that the autonomous operation of the first vehicle is operating within the defined parameter values, the process 11000 returns to operation 11010.
[0150] In an implementation, a deviation value can be generated based on a comparison of the location data and the kinetic state data of the first vehicle. The determination of whether the autonomous operation of the first vehicle is operating outside the defined parameter values can be based on a comparison of the deviation value with a deviation threshold. For example, the state data can include route data that indicates a route and a destination for a vehicle. By comparing a deviation value (indicating the extent to which the vehicle's position deviates from the route) with a deviation threshold, it can be determined whether the autonomous operation of a vehicle, including the first vehicle, is operating outside the defined parameter values.
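A minimal sketch of the check performed in operation 11040, combining a simple parameter range test (for example, available fuel) with the deviation comparison of paragraph [0150], is shown below. The threshold values, the straight-line distance approximation, and the helper names are hypothetical and are used only to make the logic concrete; they are not the method defined by this disclosure.

```python
import math

def outside_defined_parameters(state, fuel_threshold=0.1, deviation_limit_m=250.0):
    """Return True if the vehicle's autonomous operation is outside the
    defined parameter values (illustrative thresholds only)."""
    # Parameter check: available fuel below the defined fuel level parameter value.
    if state["fuel_fraction"] < fuel_threshold:
        return True
    # Deviation check: distance between the reported position and the nearest
    # planned route point, compared against a deviation threshold.
    deviation_m = min(
        math.dist((state["lat"], state["lon"]), waypoint) * 111_000  # rough deg -> m
        for waypoint in state["route_waypoints"]
    )
    return deviation_m > deviation_limit_m

vehicle = {"fuel_fraction": 0.4, "lat": 37.7800, "lon": -122.4100,
           "route_waypoints": [(37.7801, -122.4102), (37.7820, -122.4150)]}
print(outside_defined_parameters(vehicle))   # False: within the defined parameters
```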
[0151] In operation 11050, a different set of instruction data is transmitted to one or more vehicles, including the first vehicle. The different set of instruction data can be output as instructions via an in-vehicle communication system. For example, the instructions can be output as any of: a visual output, including text or images generated on a display device (for example, a video monitor); an audible output, including sounds or words generated through a speaker system; and a tactile output, including vibrations generated through a portion of the vehicle, including a portion of the vehicle that is accessible to a vehicle occupant. The different set of instruction data is transmitted in response to the determination made in operation 11040, in which a condition or the defined parameter values are analyzed.
[0152] In operation 11060, a control station, such as a second level control station (for example, a fleet manager) associated with, or linked to, the first level control stations (for example, vehicle managers), distributes monitoring and control of the vehicles to the first level control stations. By distributing vehicle monitoring and control among several first level control stations, the workload between the vehicle manager control stations can be balanced. For example, a fleet manager may notice that, between two vehicle managers, one of the two vehicle managers is monitoring five vehicles and the other of the two vehicle managers is monitoring only one vehicle. As a result, the fleet manager can redistribute some vehicles from the first vehicle manager to the other of the two vehicle managers. By balancing the vehicle workload across the control stations (or remote vehicle support queues), the computing resources of the vehicle monitoring system (or the vehicle remote support apparatus) can be used more effectively.
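The redistribution described in operation 11060 can be pictured as moving vehicles from the most heavily loaded first level control station to the least loaded one until the counts are roughly even. The greedy strategy below is a simplified sketch under that assumption; a real fleet manager may also weigh priorities, pending issues, and vehicle manager feedback.

```python
def rebalance(assignments):
    """Greedily move vehicles from the busiest station to the least busy one
    until the per-station counts differ by at most one vehicle."""
    while True:
        busiest = max(assignments, key=lambda s: len(assignments[s]))
        lightest = min(assignments, key=lambda s: len(assignments[s]))
        if len(assignments[busiest]) - len(assignments[lightest]) <= 1:
            return assignments
        # Reassign one vehicle from the overloaded station to the lightly loaded one.
        assignments[lightest].append(assignments[busiest].pop())

stations = {"manager_larry": ["v1", "v2", "v3", "v4", "v5"], "manager_ada": ["v6"]}
print(rebalance(stations))
# {'manager_larry': ['v1', 'v2', 'v3'], 'manager_ada': ['v6', 'v5', 'v4']}
```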
[0153] Figure 12 is a flow chart of a technique 12000 for remote support of autonomous vehicles according to the present disclosure. The technique 12000 is used by a vehicle monitoring system, such as a vehicle remote support apparatus. Some or all aspects of the technique 12000 for remote support can be implemented in a vehicle including the vehicle 1000 shown in figure 1, the vehicle 2100 shown in figure 2, or a computing device including the controller device 2410 shown in figure 2. In an implementation, some or all aspects of the technique 12000 for remote support can be implemented in a system that combines some or all of the features described in this disclosure.
[0154] In operation 12010, state data for a plurality of vehicles (for example, autonomous vehicles) is received by the vehicle remote support apparatus. The state data can indicate the respective current states (for example, state information) and data for each of the plurality of vehicles. In other words, each vehicle has its own state data that can be received by the vehicle remote support apparatus. In one implementation, the vehicle remote support apparatus includes a plurality of remote vehicle support queues comprising respective control stations that are managed or monitored by vehicle managers. Each control station can receive state data for one or more different vehicles of the plurality of vehicles that have been assigned to the respective control station (for example, by the fleet manager).
[0155] In operation 12020, each of the plurality of vehicles is assigned to a respective remote vehicle support queue based on the state data. The assignment can be carried out by the vehicle remote support apparatus or by a fleet manager of the vehicle remote support apparatus. Assigning vehicles to the remote vehicle support queues may include assigning one or more of the plurality of vehicles based on a comparison of a portion of the state data with assignment criteria to determine the inclusion of the vehicle in the respective remote vehicle support queue. For example, the state data of a vehicle may include an indication of the vehicle's destination. Consequently, vehicles with the same destination could be assigned to the same queue, in which, for example, the vehicles could be monitored and managed (including remote operation) by the same vehicle manager assigned to that remote vehicle support queue.
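The assignment of vehicles to remote vehicle support queues by comparing a portion of the state data with assignment criteria, as described for operation 12020, can be sketched as below, here using a shared destination as the criterion. The queue naming scheme is an assumption made for illustration only.

```python
def assign_to_queues(vehicles, criterion="destination"):
    """Assign each vehicle to a remote vehicle support queue keyed by a
    portion of its state data (here, the destination)."""
    queues = {}
    for vehicle in vehicles:
        queue_key = f"queue::{vehicle[criterion]}"
        queues.setdefault(queue_key, []).append(vehicle["id"])
    return queues

vehicles = [{"id": "v1", "destination": "airport"},
            {"id": "v2", "destination": "airport"},
            {"id": "v3", "destination": "downtown"}]
print(assign_to_queues(vehicles))
# {'queue::airport': ['v1', 'v2'], 'queue::downtown': ['v3']}
```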
[0156] In operation 12030, an indication from a first vehicle of the plurality of vehicles is received by the vehicle remote support apparatus. The indication may include an indication that the first vehicle is requesting remote support. For example, the indication can include any of a communication from an occupant of the first vehicle and an indication signal including the state or state data of the first vehicle at a period of time relative to the time of the request. For example, the state data in the indication signal can be used to determine the identity, location, and state of the first vehicle associated with the remote support request and the condition of the first vehicle when the request was made.
[0157] In operation 12040, after receiving the indication in operation 12030 and before providing remote support to the first vehicle through operation 12050, the first vehicle can optionally be reassigned from one control station to another control station based on the ability of a control station to resolve the request received in operation 12030. For example, a vehicle (such as the first vehicle) can be reassigned from a first control station to a second control station (of a plurality of control stations) responsive to a change in the state data for any of the plurality of vehicles (including the state data for the first vehicle). The change in the state data for any of the plurality of vehicles can be based on an input to the vehicle remote support apparatus, including a display of the state data for each of the plurality of vehicles assigned to the first control station and each of the plurality of vehicles assigned to the second control station.
[0158] For example, based on a vehicle manager interacting with a portion of a touchscreen, including entering an indication that a task (for example, a delivery) has been completed or is no longer needed, the vehicle can be assigned to the lowest priority control station. In another example, if a vehicle manager is assigned an increasing number of high priority vehicles and associated tasks, some of these vehicles can be redistributed to another vehicle manager that is not monitoring as many high priority vehicles and therefore has additional resources to do so.
[0159] In operation 12050, remote support is provided to one of the vehicles, such as the first vehicle. Remote support can be provided over a communications link of the vehicle remote support apparatus, transmitting instruction data to modify the autonomous operation of the vehicle. In an implementation, the instruction data can be provided from the remote vehicle support queue that includes the first vehicle. For example, remote vehicle support may be provided by a vehicle manager assigned to the remote vehicle support queue that includes the vehicle and that is designated for monitoring by the vehicle manager.
[0160] In operation 12060, a map display is generated by the vehicle remote support apparatus. The map display may include a geographic map that includes the respective locations of the plurality of vehicles, including a first set of the plurality of vehicles assigned to a first remote vehicle support queue monitored by a first vehicle manager of a plurality of vehicle managers (or by a first control station of a plurality of first level control stations). As an example, the map display can include a display such as the map portion 5020, shown in figure 5, and the map portion 7020, shown in figure 7.
[0161] In operation 12070, a vehicle state (or status) display is generated by the vehicle remote support apparatus. The vehicle state display includes a representation of the state data for the first set of the plurality of vehicles as respective indicators. The first state data of the first vehicle can be represented by a first indicator. For example, the vehicle state display includes indicators such as the vehicle indicator 4000, shown in figure 4, and the vehicle indicator 6000, shown in figure 6. In one implementation, the map display generated through operation 12060 includes a display of the same vehicles as the first set of the plurality of vehicles within the vehicle state display that is adjacent to the map display. In an implementation, the vehicle state display is similar to the vehicle manager portion 7010 of the vehicle manager interface 7000 illustrated in figure 7.
[0162] Figure 13 is a flow chart of a technique 13000 for remote support of autonomous vehicles according to the present disclosure. Some or all aspects of the technique 13000 for remote support can be implemented in a vehicle including the vehicle 1000 shown in figure 1, the vehicle 2100 shown in figure 2, or a computing device including the controller device 2410 shown in figure 2. In an implementation, some or all aspects of the technique 13000 for remote support can be implemented in a system that combines some or all of the features described in this disclosure.
[0163] In operation 13010, responsive to an input signal for a first indicator (for example, a vehicle manager clicks on a vehicle indicator associated with a first vehicle), a remote support interface for one or more vehicles, including the first vehicle, is generated and displayed. The remote support interface displays image data, including, but not limited to, at least one image from the vehicle's camera (for example, a camera mounted outside the vehicle that captures the surrounding environment). The transmission of the instruction data that is sent by the vehicle manager to modify the autonomous operation of the vehicle is initiated via the remote support interface. In one implementation, the image data includes any of an image of a passenger inside the first vehicle, an image of an obstruction in front of the first vehicle, and any other image associated with the first vehicle.
[0164] In operation 13020, a remote support interface is generated and used to select an instruction indication from the remote support interface. The remote support interface can be coupled to an input device through which instruction signals can be entered. For example, an operator of the remote support interface can select an instruction including, but not limited to, “changing lanes” intended for transmission to one of the vehicles that the operator is monitoring within its respective remote vehicle support queue (or first level control station). The instruction can be selected based on the vehicle state data, which can, for example, indicate that there is an accident blocking one of the lanes ahead and that changing lanes at the current time could save travel time.
[0165] In operation 13030, based on image data that may indicate an obstruction associated with a current route of the first vehicle that can negatively impact parameters including, but not limited to, travel time, a new route can be determined to allow the vehicle to travel around the obstruction (or to avoid the obstruction altogether), deviating from the current route determined through the autonomous operation of the first vehicle. This example highlights how remote support and control of the first vehicle, which is operating autonomously, can optimize the performance of the first vehicle.
[0166] In operation 13040, the new route that is generated can be included in the instruction data that is transmitted to the vehicle that is obstructed. For example, the instruction data can be transmitted over a network, including a wireless network (for example, a cellular network), or any other communication link, system, or platform between the remote support system and the vehicle being monitored/supported.
[0167] In operation 13050, the state data of the other vehicles, other than the first vehicle, that have a respective route that will similarly cross (or pass in close proximity to) any of the first vehicle and the obstruction is analyzed. For example, sensors in the first vehicle can be used to determine when the first vehicle and an external object, including another vehicle, will intersect. Aggregated data (including current data, route data, etc.) collected from the entire plurality of vehicles can also be used to determine whether any other vehicles could be affected by the same obstruction.
[0168] In operation 13060, if a determination is made that other vehicles will also be impacted by the obstruction, remote support is provided to at least one of the vehicles other than the first vehicle. Remote support can be provided by transmitting instruction data to modify the autonomous operation of at least one of the other vehicles to avoid the obstruction or to wait until the obstruction is cleared. For example, when it is determined that an external object, including another vehicle, is on a path that intersects that of a vehicle, instruction data can be sent to the vehicle to request that the vehicle change its path. The instruction data can also include an instruction for the vehicle to use an external communication channel, such as an external speaker, to signal to external objects, including pedestrians and other vehicles, requesting that the external objects change their trajectory so as not to obstruct the passage of the vehicle.
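Operations 13050 and 13060 amount to scanning the routes of the remaining vehicles for points near the reported obstruction and flagging those vehicles for a reroute or a hold instruction. The sketch below makes that concrete under simplifying assumptions (straight-line distance, a fixed proximity radius, and a caller that passes the fleet without the reporting vehicle); it is not the method defined by this disclosure.

```python
import math

def vehicles_near_obstruction(fleet, obstruction, radius_m=150.0):
    """Return the ids of vehicles whose planned route passes within radius_m
    of the obstruction; the caller is assumed to pass the fleet minus the
    vehicle that reported the obstruction."""
    affected = []
    for vehicle in fleet:
        for lat, lon in vehicle["route_waypoints"]:
            # Rough conversion from degrees to meters for short distances.
            d_m = math.dist((lat, lon), obstruction) * 111_000
            if d_m <= radius_m:
                affected.append(vehicle["id"])
                break
    return affected

fleet = [{"id": "v2", "route_waypoints": [(37.7805, -122.4103), (37.7830, -122.4160)]},
         {"id": "v3", "route_waypoints": [(37.3000, -121.9000)]}]
print(vehicles_near_obstruction(fleet, obstruction=(37.7806, -122.4104)))  # ['v2']
```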
[0169] Figure 14 is a flow chart of a technique 14000 for remote support of autonomous vehicles according to the present disclosure. Some or all aspects of the technique 14000 for remote support can be implemented in a vehicle including the vehicle 1000 shown in figure 1, the vehicle 2100 shown in figure 2, or a computing device including the controller device 2410 shown in figure 2. In an implementation, some or all aspects of the technique 14000 for remote support can be implemented in a system that combines some or all of the features described in this disclosure.
[0170] In operation 14010, a vehicle remote support apparatus classifies vehicles, including a first vehicle, based on the state (or status) data of the vehicles. The state data of the first vehicle includes an issue state of the first vehicle. The issue state indicates any of: a normal state, such as when the vehicle, the vehicle occupants, and the external objects around the vehicle are within the normal operating parameter values; an issue with a passenger; an issue with traffic; a physical issue of the first vehicle; and an issue related to a decision of the autonomous control of the first vehicle.
[0171] In operation 14020, a display is generated including a plurality of indicators arranged according to the classification. Each indicator of the plurality of indicators represents the state data for a respective vehicle, and each vehicle is assigned to one of the respective remote vehicle support queues (or first level control stations monitored by a vehicle manager) based on the classification. Each remote vehicle support queue is configured to provide remote support for the assigned vehicles using the respective assigned vehicle indicators. In other words, each remote vehicle support queue monitors a set of vehicles, and each indicator for each respective vehicle in the vehicle set is displayed in the vehicle indicator format (such as the vehicle indicator 6000 in figure 6) within an interface of the vehicle remote support apparatus.
[0172] In operation 14030, in response to an interaction with a first indicator representing a first vehicle (for example, an operator selects or highlights the first indicator displayed in the interface), instruction data is transmitted to the first vehicle of the vehicles. For example, the interaction can include any of a haptic input, such as touching a touchscreen, an audible input, such as a voice command, and a visual input, such as a gesture captured by an optical sensor. The instruction data can include an instruction to make the first vehicle's autonomous operation make a change in vehicle operation that provides a benefit (for example, following a new travel route to avoid congestion or a traffic obstruction) or that is necessary (for example, bringing the vehicle back to a base for refueling or maintenance). For example, the instruction data can change the vehicle's travel route to reach the same destination using a different route, but in a shorter time.
[0173] In operation 14040, based on a received input, at least one vehicle currently assigned to one of the remote vehicle support queues that is determined to be overloaded is reassigned to another of the remote vehicle support queues that is not overloaded, based on the state data. In this way, the workload between the remote vehicle support queues can be balanced. In another implementation, vehicles are reassigned based on historical data and feedback obtained from similar situations and the ways in which the vehicle managers (monitoring each of the remote vehicle support queues) dealt with issues in the past. For example, if a vehicle manager has been criticized for not being responsive to a particular type of remote support request, a vehicle can be transferred from that vehicle manager to another vehicle manager with better reviews. In addition, vehicles can be redistributed automatically and dynamically using machine learning techniques that use various algorithms to determine the most optimized assignment settings.
[0174] In operation 14050, at least some of the plurality of vehicle indicators are grouped for viewing based on shared entries in the state data. Shared entries may include, but are not limited to, a travel route. For example, the plurality of indicators can be grouped according to a shared destination based on the travel route information in the state data. In another example, the plurality of indicators is grouped based on levels of priority or importance.
[0175] In operation 14060, a map display that corresponds to a defined geographical area is generated by the vehicle remote support apparatus. The map display can be generated simultaneously (for example, side by side) with the display that includes the plurality of indicators. The map display can include an icon for each vehicle within the defined geographical area, and the icon can include various attributes of the vehicle being displayed (for example, direction of travel, length of journey, fuel level, etc.).
[0176] Figure 15 is a flow chart of a method 15000 for providing remote support for the autonomous operation of a plurality of vehicles in accordance with the present disclosure. The method 15000 includes receiving, by a vehicle remote support apparatus, state (or status) data for the plurality of vehicles, where the state data includes a current state (or state information) of the plurality of vehicles, via 15010. The method 15000 includes assigning, by the vehicle remote support apparatus, the plurality of vehicles to a plurality of remote vehicle support queues (or control stations) based on the state data, via 15020. Each of the plurality of remote vehicle support queues includes a subset of the plurality of vehicles, and the subset can be redistributed by an operator (for example, a fleet manager) of the vehicle remote support apparatus.
[0177] In figure 15, the method 15000 also includes receiving, by the vehicle remote support apparatus, an indication that a vehicle of the plurality of vehicles is requesting remote support, via 15030. In other words, the vehicle remote support apparatus receives a request from one of the plurality of vehicles indicating that support or help is requested. After receiving the request for remote support or help, and in response to a determination that a change in the state data indicates that the vehicle is operating outside the defined parameters, the method 15000 includes transmitting, by the vehicle remote support apparatus, instruction data to the vehicle, via 15040. As such, the instruction data changes the autonomous operation of the vehicle. The instruction data can enable the vehicle to overcome the issue that prompted the remote support request to the vehicle remote support apparatus.
[0178] The disclosed technology provides a remote monitoring system that more efficiently distributes workloads among remote monitoring system operators (for example, human operators, such as fleet managers and vehicle managers) in charge of managing autonomous vehicles. In addition, the disclosed technology is able to more effectively organize data related to the operation of autonomous vehicles (for example, key state, environmental, and route data related to monitored vehicles), thereby providing human operators with a way to proactively manage autonomous and semi-autonomous vehicles. The remote monitoring system incorporates external data, such as real-time traffic data, to assist in the instruction data that is generated and transmitted to vehicles for optimal performance. In this way, the flow of autonomous vehicles through a transportation network is improved, which improves the use of available transportation resources, increases passenger safety, and improves the on-time arrival of passengers and cargo.
[0179] Although the disclosed technology has been described in relation to certain embodiments, it should be understood that the disclosed technology is not to be limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as permitted by law.
Claims (19)
[1]
1. System for remote support of autonomous vehicle operation, the system CHARACTERIZED by the fact that it comprises:
a memory and a processor configured to execute instructions stored in memory to:
receive status data for vehicles, status data including current vehicle status;
assign vehicles to remote support queues for vehicles based on status data;
generate a map display including a geographical map of the respective locations of a set of vehicles that are assigned to remote vehicle support queues;
generate a vehicle status display including the status data for the vehicle set as respective indicators;
receive an indication that a vehicle of the vehicles is requesting remote support and, in response to a determination that a change in the state data indicates that the autonomous operation of the vehicle is operating outside the defined parameter values, transmit instruction data to the vehicle to modify the autonomous operation of the vehicle.
[2]
2. System according to claim 1, CHARACTERIZED by the fact that the remote vehicle support queues comprise first level control stations, each first level control station receiving status data for different vehicles.
[3]
3. System according to claim 2, CHARACTERIZED by the fact that the vehicles are assigned to the first level control stations using a second level control station.
[4]
4. System according to claim 1, CHARACTERIZED by the fact that the state data includes any of kinetic state data and location data for the vehicles.
[5]
5. System according to claim 1, CHARACTERIZED by the fact that the processor is further configured to execute instructions stored in memory to:
responsive to an input signal for a first indicator representing the vehicle status data, display a remote support interface for the vehicle, the remote support interface based on image data including at least one vehicle camera image.
[6]
6. Method for providing remote support for autonomous vehicle operation, the method CHARACTERIZED by the fact that it comprises:
receive, by a vehicle remote support device, status data for the vehicles, the status data including a current status of the vehicles;
assign, by the vehicle remote support device, vehicles to remote vehicle support queues based on status data;
generate, by the vehicle remote support device, a map display including a geographic map of the respective locations of a set of vehicles that are assigned to the remote vehicle support queues;
generate, by the vehicle remote support device, a vehicle status display including the status data for the vehicle set as respective indicators;
receive, by the vehicle remote support device, an indication that a vehicle of the vehicles is requesting remote support and, in response to a determination that a change in the state data indicates that the autonomous operation of the vehicle is operating outside the defined parameter values, transmit the instruction data to the vehicle via the vehicle remote support device to modify the autonomous operation of the vehicle.
[7]
7. Method according to claim 6, CHARACTERIZED by the fact that the remote vehicle support queues comprise first level control stations, each first level control station receiving status data for different vehicles.
[8]
8. Method according to claim 7, CHARACTERIZED by the fact that it further comprises:
after receiving the indication and before transmitting the instruction data to the vehicle, assign, by a second level control station of the vehicle remote support device, the vehicle from a first control station of the first level control stations to a second control station of the first level control stations in response to a change in the status data for the vehicles, where the change in the status data is based on an input to the vehicle remote support device that displays the status data for each of the vehicles assigned to the first control station and each of the vehicles assigned to the second control station.
[9]
9. Method according to claim 6, CHARACTERIZED by the fact that it further comprises:
in response to an input signal for a first indicator representing vehicle status data, display a remote support interface for the vehicle, the remote support interface based on image data including at least one vehicle camera image, where the transmission of instruction data is initiated via the remote support interface.
[10]
10. Method according to claim 9, CHARACTERIZED by the fact that the at least one camera image is an image of a passenger inside the vehicle and the transmission of instruction data is based on the selection of an instruction indication of the remote support interface.
[11]
11. Method according to claim 9, CHARACTERIZED by the fact that the at least one camera image is an image of an obstruction in the vicinity of the vehicle, and further comprises:
determine a new route that avoids the obstruction and deviates from a current route determined by the vehicle's autonomous operation and transmit the new route to the vehicle for implementation by the vehicle's autonomous operation.
[12]
12. Method according to claim 11, CHARACTERIZED by the fact that it further comprises:
analyze the state data of at least one of the vehicles, other than the vehicle, in the vicinity of the obstruction and provide remote support to the at least one of the vehicles by transmitting instruction data to the at least one of the vehicles to modify the autonomous operation of the at least one of the vehicles.
[13]
13. Method according to claim 6, CHARACTERIZED by the fact that it further comprises:
classify, by the vehicle remote support device, the vehicles based on the status data and generate, by the vehicle remote support device, the vehicle status display including the respective indicators arranged according to the classification, each indicator of the respective indicators representing the status data of a respective vehicle, where each vehicle is assigned to a respective remote vehicle support queue based on the classification, each remote vehicle support queue configured to provide remote support using the respective assigned vehicle indicators.
[14]
14. Method according to claim 13, CHARACTERIZED by the fact that it further comprises:
assign, by the vehicle remote support device, at least one vehicle from one of the remote vehicle support queues that is determined to be overloaded to another of the remote vehicle support queues that is determined not to be overloaded, where being overloaded is based on the status data of the assigned vehicles.
[15]
15. Method according to claim 6, CHARACTERIZED by the fact that the instruction data comprises an instruction to make the vehicle's autonomous operation follow a new route.
[16]
16. Method according to claim 6, CHARACTERIZED by the fact that the vehicle status data includes an issue state of the vehicle, the issue state indicating any one of a normal state, an issue with a passenger, an issue with traffic, a physical issue of the vehicle, and an issue related to a decision based on the autonomous operation of the vehicle.
[17]
17. Method according to claim 6, CHARACTERIZED by the fact that the map display includes an icon for each vehicle of the set of vehicles.
[18]
18. Method according to claim 13, CHARACTERIZED by the fact that it further comprises:
group at least some of the respective indicators based on shared entries in the state data.
[19]
19. Method according to claim 18, CHARACTERIZED by the fact that the shared entries include a travel route for the vehicles.
Similar technologies:
Publication number | Publication date | Patent title
BR112019010723A2|2019-10-01|autonomous car teleoperation to negotiate problematic situations
JP2019500711A|2019-01-10|Software application that requests and controls autonomous vehicle services
JP6726363B2|2020-07-22|Autonomous vehicle monitoring using the generated interface
WO2017079290A1|2017-05-11|Coordination of dispatching and maintaining fleet of autonomous vehicles
US10591912B2|2020-03-17|Autonomous vehicle remote support mapping interface
JP2019537159A5|2020-04-16|
JP6716792B2|2020-07-01|Generation of solution data for autonomous vehicles to deal with problem situations
WO2017079321A1|2017-05-11|Sensor-based object-detection optimization for autonomous vehicles
BR112019011455A2|2019-10-15|bandwidth constrained image processing for autonomous vehicles
JP2021509992A|2021-04-08|Centralized shared autonomous vehicle operation management
JP2021512304A|2021-05-13|Computer framework for batch routing of autonomous vehicles
US20210356954A1|2021-11-18|Joint optimization of robotic vehicle routing for ride quality, safety, and operator demand
Patent family:
Publication number | Publication date
CN110235070B|2020-11-10|
JP2019537155A|2019-12-19|
MX2019006128A|2020-01-27|
US20200293065A1|2020-09-17|
EP3548979A4|2019-12-04|
EP3548979A1|2019-10-09|
JP6732129B2|2020-07-29|
WO2018102477A1|2018-06-07|
EP3548979B1|2021-06-23|
US10705539B2|2020-07-07|
US20190278298A1|2019-09-12|
CN110235070A|2019-09-13|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

CN1834585B|2006-05-08|2011-10-19|深圳市赛格导航科技股份有限公司|Dispatching guidance system of mining vehicle|
JP4899746B2|2006-09-22|2012-03-21|日産自動車株式会社|Route guidance display device|
CN101289059B|2007-04-17|2010-06-16|林修安|Fuel management method|
SE1150075A1|2011-02-03|2012-08-04|Scania Cv Ab|Method and management unit in connection with vehicle trains|
CN103875000B|2011-09-22|2016-04-06|阿索恩公司|For the monitoring of autonomous mobile robot, diagnosis and trace tool|
US10031520B2|2011-11-08|2018-07-24|The United States Of America, As Represented By The Secretary Of The Navy|System and method for predicting an adequate ratio of unmanned vehicles to operators|
US8855847B2|2012-01-20|2014-10-07|Toyota Motor Engineering & Manufacturing North America, Inc.|Intelligent navigation system|
JP6033838B2|2012-02-27|2016-11-30|ヤマハ発動機株式会社|Operating state sharing device and operating state sharing system|
JP2013196632A|2012-03-22|2013-09-30|Hitachi Kokusai Electric Inc|Communication system|
US10168674B1|2013-04-22|2019-01-01|National Technology & Engineering Solutions Of Sandia, Llc|System and method for operator control of heterogeneous unmanned system teams|
WO2015094807A1|2013-12-16|2015-06-25|Contour Hardening, Inc.|System and method for control of an electric vehicle|
US9720410B2|2014-03-03|2017-08-01|Waymo Llc|Remote assistance for autonomous vehicles in predetermined situations|
US9710976B2|2014-06-03|2017-07-18|Hyundai Motor Company|System and method for transmitting data of a vehicle|
US20160050315A1|2014-08-14|2016-02-18|Harman International Industries, Incorporated|Driver status indicator|
US9494935B2|2014-11-13|2016-11-15|Toyota Motor Engineering & Manufacturing North America, Inc.|Remote operation of autonomous vehicle in unexpected environment|
US9310802B1|2015-02-05|2016-04-12|Jaybridge Robotics, Inc.|Multi-operator, multi-robot control system with automatic vehicle selection|
US10345809B2|2015-05-13|2019-07-09|Uber Technologies, Inc.|Providing remote assistance to an autonomous vehicle|
JP5957744B1|2015-07-31|2016-07-27|パナソニックIpマネジメント株式会社|Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle|
JP5957745B1|2015-07-31|2016-07-27|パナソニックIpマネジメント株式会社|Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle|
US9632502B1|2015-11-04|2017-04-25|Zoox, Inc.|Machine-learning systems and techniques to optimize teleoperation and/or planner decisions|
US10401852B2|2015-11-04|2019-09-03|Zoox, Inc.|Teleoperation system and method for trajectory modification of autonomous vehicles|
US10627832B2|2016-03-18|2020-04-21|Florida Institute For Human And Machine Cognition, Inc.|Object management display|
DE102016218012A1|2016-09-20|2018-03-22|Volkswagen Aktiengesellschaft|Method for a data processing system for maintaining an operating state of a first autonomous vehicle and method for a data processing system for managing a plurality of autonomous vehicles|
US10322722B2|2016-10-14|2019-06-18|GM Global Technology Operations LLC|Method of controlling an autonomous vehicle|
US10293818B2|2017-03-07|2019-05-21|Uber Technologies, Inc.|Teleassistance data prioritization for self-driving vehicles|
CN107589745B|2017-09-22|2021-04-16|京东方科技集团股份有限公司|Driving method, vehicle-mounted driving terminal, remote driving terminal, equipment and storage medium|
US10775806B2|2017-12-22|2020-09-15|Lyft, Inc.|Autonomous-vehicle dispatch based on fleet-level target objectives|
US11181909B2|2018-02-19|2021-11-23|Denso Ten Limited|Remote vehicle control device, remote vehicle control system, and remote vehicle control method|
JP2019156299A|2018-03-15|2019-09-19|株式会社デンソーテン|Vehicle remote control device and vehicle remote control method|
JP2020060841A|2018-10-05|2020-04-16|パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America|Information processing method, and information processing system|
JP2020061119A|2018-10-05|2020-04-16|パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America|Information processing method, and information processing system|
JP2020061121A|2018-10-05|2020-04-16|パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America|Information processing method, and information processing system|
EP3907115A4|2019-03-08|2022-03-16|Mazda Motor|Arithmetic operation system for vehicle|
WO2020206224A1|2019-04-05|2020-10-08|Kohler Co.|Dual fuel generator|
WO2021042049A1|2019-08-30|2021-03-04|Optimus Ride, Inc.|Monitor assignment system and method|
CN113022540A|2020-04-17|2021-06-25|青岛慧拓智能机器有限公司|Real-time remote driving system and method for monitoring multiple vehicle states|
Legal status:
2021-10-05| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Filing date | Patent title
US201662428002P| true| 2016-11-30|2016-11-30|
PCT/US2017/063821|WO2018102477A1|2016-11-30|2017-11-30|Tele-operation of autonomous cars to negotiate problem situations|